Jan 21 17:10:38 localhost kernel: Linux version 5.14.0-661.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-69.el9) #1 SMP PREEMPT_DYNAMIC Fri Jan 16 09:19:22 UTC 2026
Jan 21 17:10:38 localhost kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Jan 21 17:10:38 localhost kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64 root=UUID=22ac9141-3960-4912-b20e-19fc8a328d40 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 21 17:10:38 localhost kernel: BIOS-provided physical RAM map:
Jan 21 17:10:38 localhost kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Jan 21 17:10:38 localhost kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Jan 21 17:10:38 localhost kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Jan 21 17:10:38 localhost kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Jan 21 17:10:38 localhost kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Jan 21 17:10:38 localhost kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Jan 21 17:10:38 localhost kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Jan 21 17:10:38 localhost kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Jan 21 17:10:38 localhost kernel: NX (Execute Disable) protection: active
Jan 21 17:10:38 localhost kernel: APIC: Static calls initialized
Jan 21 17:10:38 localhost kernel: SMBIOS 2.8 present.
Jan 21 17:10:38 localhost kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Jan 21 17:10:38 localhost kernel: Hypervisor detected: KVM
Jan 21 17:10:38 localhost kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Jan 21 17:10:38 localhost kernel: kvm-clock: using sched offset of 3267485900 cycles
Jan 21 17:10:38 localhost kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Jan 21 17:10:38 localhost kernel: tsc: Detected 2799.998 MHz processor
Jan 21 17:10:38 localhost kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Jan 21 17:10:38 localhost kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Jan 21 17:10:38 localhost kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Jan 21 17:10:38 localhost kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Jan 21 17:10:38 localhost kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Jan 21 17:10:38 localhost kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Jan 21 17:10:38 localhost kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Jan 21 17:10:38 localhost kernel: Using GB pages for direct mapping
Jan 21 17:10:38 localhost kernel: RAMDISK: [mem 0x2d426000-0x32a0afff]
Jan 21 17:10:38 localhost kernel: ACPI: Early table checksum verification disabled
Jan 21 17:10:38 localhost kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Jan 21 17:10:38 localhost kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 21 17:10:38 localhost kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 21 17:10:38 localhost kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 21 17:10:38 localhost kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Jan 21 17:10:38 localhost kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 21 17:10:38 localhost kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 21 17:10:38 localhost kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Jan 21 17:10:38 localhost kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Jan 21 17:10:38 localhost kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Jan 21 17:10:38 localhost kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Jan 21 17:10:38 localhost kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Jan 21 17:10:38 localhost kernel: No NUMA configuration found
Jan 21 17:10:38 localhost kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Jan 21 17:10:38 localhost kernel: NODE_DATA(0) allocated [mem 0x23ffd5000-0x23fffffff]
Jan 21 17:10:38 localhost kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Jan 21 17:10:38 localhost kernel: Zone ranges:
Jan 21 17:10:38 localhost kernel:   DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Jan 21 17:10:38 localhost kernel:   DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Jan 21 17:10:38 localhost kernel:   Normal   [mem 0x0000000100000000-0x000000023fffffff]
Jan 21 17:10:38 localhost kernel:   Device   empty
Jan 21 17:10:38 localhost kernel: Movable zone start for each node
Jan 21 17:10:38 localhost kernel: Early memory node ranges
Jan 21 17:10:38 localhost kernel:   node   0: [mem 0x0000000000001000-0x000000000009efff]
Jan 21 17:10:38 localhost kernel:   node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Jan 21 17:10:38 localhost kernel:   node   0: [mem 0x0000000100000000-0x000000023fffffff]
Jan 21 17:10:38 localhost kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Jan 21 17:10:38 localhost kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Jan 21 17:10:38 localhost kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Jan 21 17:10:38 localhost kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Jan 21 17:10:38 localhost kernel: ACPI: PM-Timer IO Port: 0x608
Jan 21 17:10:38 localhost kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Jan 21 17:10:38 localhost kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Jan 21 17:10:38 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Jan 21 17:10:38 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Jan 21 17:10:38 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Jan 21 17:10:38 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Jan 21 17:10:38 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Jan 21 17:10:38 localhost kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Jan 21 17:10:38 localhost kernel: TSC deadline timer available
Jan 21 17:10:38 localhost kernel: CPU topo: Max. logical packages:   8
Jan 21 17:10:38 localhost kernel: CPU topo: Max. logical dies:       8
Jan 21 17:10:38 localhost kernel: CPU topo: Max. dies per package:   1
Jan 21 17:10:38 localhost kernel: CPU topo: Max. threads per core:   1
Jan 21 17:10:38 localhost kernel: CPU topo: Num. cores per package:     1
Jan 21 17:10:38 localhost kernel: CPU topo: Num. threads per package:   1
Jan 21 17:10:38 localhost kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Jan 21 17:10:38 localhost kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Jan 21 17:10:38 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Jan 21 17:10:38 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Jan 21 17:10:38 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Jan 21 17:10:38 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Jan 21 17:10:38 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Jan 21 17:10:38 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Jan 21 17:10:38 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Jan 21 17:10:38 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Jan 21 17:10:38 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Jan 21 17:10:38 localhost kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Jan 21 17:10:38 localhost kernel: Booting paravirtualized kernel on KVM
Jan 21 17:10:38 localhost kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Jan 21 17:10:38 localhost kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Jan 21 17:10:38 localhost kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Jan 21 17:10:38 localhost kernel: pcpu-alloc: s225280 r8192 d28672 u262144 alloc=1*2097152
Jan 21 17:10:38 localhost kernel: pcpu-alloc: [0] 0 1 2 3 4 5 6 7 
Jan 21 17:10:38 localhost kernel: kvm-guest: PV spinlocks disabled, no host support
Jan 21 17:10:38 localhost kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64 root=UUID=22ac9141-3960-4912-b20e-19fc8a328d40 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 21 17:10:38 localhost kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64", will be passed to user space.
Jan 21 17:10:38 localhost kernel: random: crng init done
Jan 21 17:10:38 localhost kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Jan 21 17:10:38 localhost kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Jan 21 17:10:38 localhost kernel: Fallback order for Node 0: 0 
Jan 21 17:10:38 localhost kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Jan 21 17:10:38 localhost kernel: Policy zone: Normal
Jan 21 17:10:38 localhost kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Jan 21 17:10:38 localhost kernel: software IO TLB: area num 8.
Jan 21 17:10:38 localhost kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Jan 21 17:10:38 localhost kernel: ftrace: allocating 49417 entries in 194 pages
Jan 21 17:10:38 localhost kernel: ftrace: allocated 194 pages with 3 groups
Jan 21 17:10:38 localhost kernel: Dynamic Preempt: voluntary
Jan 21 17:10:38 localhost kernel: rcu: Preemptible hierarchical RCU implementation.
Jan 21 17:10:38 localhost kernel: rcu:         RCU event tracing is enabled.
Jan 21 17:10:38 localhost kernel: rcu:         RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Jan 21 17:10:38 localhost kernel:         Trampoline variant of Tasks RCU enabled.
Jan 21 17:10:38 localhost kernel:         Rude variant of Tasks RCU enabled.
Jan 21 17:10:38 localhost kernel:         Tracing variant of Tasks RCU enabled.
Jan 21 17:10:38 localhost kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Jan 21 17:10:38 localhost kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Jan 21 17:10:38 localhost kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 21 17:10:38 localhost kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 21 17:10:38 localhost kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 21 17:10:38 localhost kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Jan 21 17:10:38 localhost kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Jan 21 17:10:38 localhost kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Jan 21 17:10:38 localhost kernel: Console: colour VGA+ 80x25
Jan 21 17:10:38 localhost kernel: printk: console [ttyS0] enabled
Jan 21 17:10:38 localhost kernel: ACPI: Core revision 20230331
Jan 21 17:10:38 localhost kernel: APIC: Switch to symmetric I/O mode setup
Jan 21 17:10:38 localhost kernel: x2apic enabled
Jan 21 17:10:38 localhost kernel: APIC: Switched APIC routing to: physical x2apic
Jan 21 17:10:38 localhost kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Jan 21 17:10:38 localhost kernel: Calibrating delay loop (skipped) preset value.. 5599.99 BogoMIPS (lpj=2799998)
Jan 21 17:10:38 localhost kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Jan 21 17:10:38 localhost kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Jan 21 17:10:38 localhost kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Jan 21 17:10:38 localhost kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Jan 21 17:10:38 localhost kernel: Spectre V2 : Mitigation: Retpolines
Jan 21 17:10:38 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Jan 21 17:10:38 localhost kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Jan 21 17:10:38 localhost kernel: RETBleed: Mitigation: untrained return thunk
Jan 21 17:10:38 localhost kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Jan 21 17:10:38 localhost kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Jan 21 17:10:38 localhost kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Jan 21 17:10:38 localhost kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Jan 21 17:10:38 localhost kernel: x86/bugs: return thunk changed
Jan 21 17:10:38 localhost kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Jan 21 17:10:38 localhost kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Jan 21 17:10:38 localhost kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Jan 21 17:10:38 localhost kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Jan 21 17:10:38 localhost kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Jan 21 17:10:38 localhost kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Jan 21 17:10:38 localhost kernel: Freeing SMP alternatives memory: 40K
Jan 21 17:10:38 localhost kernel: pid_max: default: 32768 minimum: 301
Jan 21 17:10:38 localhost kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Jan 21 17:10:38 localhost kernel: landlock: Up and running.
Jan 21 17:10:38 localhost kernel: Yama: becoming mindful.
Jan 21 17:10:38 localhost kernel: SELinux:  Initializing.
Jan 21 17:10:38 localhost kernel: LSM support for eBPF active
Jan 21 17:10:38 localhost kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Jan 21 17:10:38 localhost kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Jan 21 17:10:38 localhost kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Jan 21 17:10:38 localhost kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Jan 21 17:10:38 localhost kernel: ... version:                0
Jan 21 17:10:38 localhost kernel: ... bit width:              48
Jan 21 17:10:38 localhost kernel: ... generic registers:      6
Jan 21 17:10:38 localhost kernel: ... value mask:             0000ffffffffffff
Jan 21 17:10:38 localhost kernel: ... max period:             00007fffffffffff
Jan 21 17:10:38 localhost kernel: ... fixed-purpose events:   0
Jan 21 17:10:38 localhost kernel: ... event mask:             000000000000003f
Jan 21 17:10:38 localhost kernel: signal: max sigframe size: 1776
Jan 21 17:10:38 localhost kernel: rcu: Hierarchical SRCU implementation.
Jan 21 17:10:38 localhost kernel: rcu:         Max phase no-delay instances is 400.
Jan 21 17:10:38 localhost kernel: smp: Bringing up secondary CPUs ...
Jan 21 17:10:38 localhost kernel: smpboot: x86: Booting SMP configuration:
Jan 21 17:10:38 localhost kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Jan 21 17:10:38 localhost kernel: smp: Brought up 1 node, 8 CPUs
Jan 21 17:10:38 localhost kernel: smpboot: Total of 8 processors activated (44799.96 BogoMIPS)
Jan 21 17:10:38 localhost kernel: node 0 deferred pages initialised in 12ms
Jan 21 17:10:38 localhost kernel: Memory: 7763868K/8388068K available (16384K kernel code, 5797K rwdata, 13916K rodata, 4200K init, 7192K bss, 618356K reserved, 0K cma-reserved)
Jan 21 17:10:38 localhost kernel: devtmpfs: initialized
Jan 21 17:10:38 localhost kernel: x86/mm: Memory block size: 128MB
Jan 21 17:10:38 localhost kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Jan 21 17:10:38 localhost kernel: futex hash table entries: 2048 (131072 bytes on 1 NUMA nodes, total 128 KiB, linear).
Jan 21 17:10:38 localhost kernel: pinctrl core: initialized pinctrl subsystem
Jan 21 17:10:38 localhost kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Jan 21 17:10:38 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Jan 21 17:10:38 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Jan 21 17:10:38 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Jan 21 17:10:38 localhost kernel: audit: initializing netlink subsys (disabled)
Jan 21 17:10:38 localhost kernel: audit: type=2000 audit(1769015436.648:1): state=initialized audit_enabled=0 res=1
Jan 21 17:10:38 localhost kernel: thermal_sys: Registered thermal governor 'fair_share'
Jan 21 17:10:38 localhost kernel: thermal_sys: Registered thermal governor 'step_wise'
Jan 21 17:10:38 localhost kernel: thermal_sys: Registered thermal governor 'user_space'
Jan 21 17:10:38 localhost kernel: cpuidle: using governor menu
Jan 21 17:10:38 localhost kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Jan 21 17:10:38 localhost kernel: PCI: Using configuration type 1 for base access
Jan 21 17:10:38 localhost kernel: PCI: Using configuration type 1 for extended access
Jan 21 17:10:38 localhost kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Jan 21 17:10:38 localhost kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Jan 21 17:10:38 localhost kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Jan 21 17:10:38 localhost kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Jan 21 17:10:38 localhost kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Jan 21 17:10:38 localhost kernel: Demotion targets for Node 0: null
Jan 21 17:10:38 localhost kernel: cryptd: max_cpu_qlen set to 1000
Jan 21 17:10:38 localhost kernel: ACPI: Added _OSI(Module Device)
Jan 21 17:10:38 localhost kernel: ACPI: Added _OSI(Processor Device)
Jan 21 17:10:38 localhost kernel: ACPI: Added _OSI(Processor Aggregator Device)
Jan 21 17:10:38 localhost kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Jan 21 17:10:38 localhost kernel: ACPI: Interpreter enabled
Jan 21 17:10:38 localhost kernel: ACPI: PM: (supports S0 S3 S4 S5)
Jan 21 17:10:38 localhost kernel: ACPI: Using IOAPIC for interrupt routing
Jan 21 17:10:38 localhost kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Jan 21 17:10:38 localhost kernel: PCI: Using E820 reservations for host bridge windows
Jan 21 17:10:38 localhost kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Jan 21 17:10:38 localhost kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Jan 21 17:10:38 localhost kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Jan 21 17:10:38 localhost kernel: acpiphp: Slot [3] registered
Jan 21 17:10:38 localhost kernel: acpiphp: Slot [4] registered
Jan 21 17:10:38 localhost kernel: acpiphp: Slot [5] registered
Jan 21 17:10:38 localhost kernel: acpiphp: Slot [6] registered
Jan 21 17:10:38 localhost kernel: acpiphp: Slot [7] registered
Jan 21 17:10:38 localhost kernel: acpiphp: Slot [8] registered
Jan 21 17:10:38 localhost kernel: acpiphp: Slot [9] registered
Jan 21 17:10:38 localhost kernel: acpiphp: Slot [10] registered
Jan 21 17:10:38 localhost kernel: acpiphp: Slot [11] registered
Jan 21 17:10:38 localhost kernel: acpiphp: Slot [12] registered
Jan 21 17:10:38 localhost kernel: acpiphp: Slot [13] registered
Jan 21 17:10:38 localhost kernel: acpiphp: Slot [14] registered
Jan 21 17:10:38 localhost kernel: acpiphp: Slot [15] registered
Jan 21 17:10:38 localhost kernel: acpiphp: Slot [16] registered
Jan 21 17:10:38 localhost kernel: acpiphp: Slot [17] registered
Jan 21 17:10:38 localhost kernel: acpiphp: Slot [18] registered
Jan 21 17:10:38 localhost kernel: acpiphp: Slot [19] registered
Jan 21 17:10:38 localhost kernel: acpiphp: Slot [20] registered
Jan 21 17:10:38 localhost kernel: acpiphp: Slot [21] registered
Jan 21 17:10:38 localhost kernel: acpiphp: Slot [22] registered
Jan 21 17:10:38 localhost kernel: acpiphp: Slot [23] registered
Jan 21 17:10:38 localhost kernel: acpiphp: Slot [24] registered
Jan 21 17:10:38 localhost kernel: acpiphp: Slot [25] registered
Jan 21 17:10:38 localhost kernel: acpiphp: Slot [26] registered
Jan 21 17:10:38 localhost kernel: acpiphp: Slot [27] registered
Jan 21 17:10:38 localhost kernel: acpiphp: Slot [28] registered
Jan 21 17:10:38 localhost kernel: acpiphp: Slot [29] registered
Jan 21 17:10:38 localhost kernel: acpiphp: Slot [30] registered
Jan 21 17:10:38 localhost kernel: acpiphp: Slot [31] registered
Jan 21 17:10:38 localhost kernel: PCI host bridge to bus 0000:00
Jan 21 17:10:38 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Jan 21 17:10:38 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Jan 21 17:10:38 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Jan 21 17:10:38 localhost kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Jan 21 17:10:38 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Jan 21 17:10:38 localhost kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Jan 21 17:10:38 localhost kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Jan 21 17:10:38 localhost kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Jan 21 17:10:38 localhost kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Jan 21 17:10:38 localhost kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Jan 21 17:10:38 localhost kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Jan 21 17:10:38 localhost kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Jan 21 17:10:38 localhost kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Jan 21 17:10:38 localhost kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Jan 21 17:10:38 localhost kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Jan 21 17:10:38 localhost kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Jan 21 17:10:38 localhost kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Jan 21 17:10:38 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Jan 21 17:10:38 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Jan 21 17:10:38 localhost kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Jan 21 17:10:38 localhost kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Jan 21 17:10:38 localhost kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Jan 21 17:10:38 localhost kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Jan 21 17:10:38 localhost kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Jan 21 17:10:38 localhost kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Jan 21 17:10:38 localhost kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Jan 21 17:10:38 localhost kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Jan 21 17:10:38 localhost kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Jan 21 17:10:38 localhost kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Jan 21 17:10:38 localhost kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Jan 21 17:10:38 localhost kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Jan 21 17:10:38 localhost kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Jan 21 17:10:38 localhost kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Jan 21 17:10:38 localhost kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Jan 21 17:10:38 localhost kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Jan 21 17:10:38 localhost kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Jan 21 17:10:38 localhost kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Jan 21 17:10:38 localhost kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Jan 21 17:10:38 localhost kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Jan 21 17:10:38 localhost kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Jan 21 17:10:38 localhost kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Jan 21 17:10:38 localhost kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Jan 21 17:10:38 localhost kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Jan 21 17:10:38 localhost kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Jan 21 17:10:38 localhost kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Jan 21 17:10:38 localhost kernel: iommu: Default domain type: Translated
Jan 21 17:10:38 localhost kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Jan 21 17:10:38 localhost kernel: SCSI subsystem initialized
Jan 21 17:10:38 localhost kernel: ACPI: bus type USB registered
Jan 21 17:10:38 localhost kernel: usbcore: registered new interface driver usbfs
Jan 21 17:10:38 localhost kernel: usbcore: registered new interface driver hub
Jan 21 17:10:38 localhost kernel: usbcore: registered new device driver usb
Jan 21 17:10:38 localhost kernel: pps_core: LinuxPPS API ver. 1 registered
Jan 21 17:10:38 localhost kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Jan 21 17:10:38 localhost kernel: PTP clock support registered
Jan 21 17:10:38 localhost kernel: EDAC MC: Ver: 3.0.0
Jan 21 17:10:38 localhost kernel: NetLabel: Initializing
Jan 21 17:10:38 localhost kernel: NetLabel:  domain hash size = 128
Jan 21 17:10:38 localhost kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Jan 21 17:10:38 localhost kernel: NetLabel:  unlabeled traffic allowed by default
Jan 21 17:10:38 localhost kernel: PCI: Using ACPI for IRQ routing
Jan 21 17:10:38 localhost kernel: PCI: pci_cache_line_size set to 64 bytes
Jan 21 17:10:38 localhost kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Jan 21 17:10:38 localhost kernel: e820: reserve RAM buffer [mem 0xbffdb000-0xbfffffff]
Jan 21 17:10:38 localhost kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Jan 21 17:10:38 localhost kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Jan 21 17:10:38 localhost kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Jan 21 17:10:38 localhost kernel: vgaarb: loaded
Jan 21 17:10:38 localhost kernel: clocksource: Switched to clocksource kvm-clock
Jan 21 17:10:38 localhost kernel: VFS: Disk quotas dquot_6.6.0
Jan 21 17:10:38 localhost kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Jan 21 17:10:38 localhost kernel: pnp: PnP ACPI init
Jan 21 17:10:38 localhost kernel: pnp 00:03: [dma 2]
Jan 21 17:10:38 localhost kernel: pnp: PnP ACPI: found 5 devices
Jan 21 17:10:38 localhost kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Jan 21 17:10:38 localhost kernel: NET: Registered PF_INET protocol family
Jan 21 17:10:38 localhost kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Jan 21 17:10:38 localhost kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Jan 21 17:10:38 localhost kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Jan 21 17:10:38 localhost kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Jan 21 17:10:38 localhost kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Jan 21 17:10:38 localhost kernel: TCP: Hash tables configured (established 65536 bind 65536)
Jan 21 17:10:38 localhost kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Jan 21 17:10:38 localhost kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Jan 21 17:10:38 localhost kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Jan 21 17:10:38 localhost kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Jan 21 17:10:38 localhost kernel: NET: Registered PF_XDP protocol family
Jan 21 17:10:38 localhost kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Jan 21 17:10:38 localhost kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Jan 21 17:10:38 localhost kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Jan 21 17:10:38 localhost kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Jan 21 17:10:38 localhost kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Jan 21 17:10:38 localhost kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Jan 21 17:10:38 localhost kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Jan 21 17:10:38 localhost kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Jan 21 17:10:38 localhost kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x160 took 96604 usecs
Jan 21 17:10:38 localhost kernel: PCI: CLS 0 bytes, default 64
Jan 21 17:10:38 localhost kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Jan 21 17:10:38 localhost kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Jan 21 17:10:38 localhost kernel: Trying to unpack rootfs image as initramfs...
Jan 21 17:10:38 localhost kernel: ACPI: bus type thunderbolt registered
Jan 21 17:10:38 localhost kernel: Initialise system trusted keyrings
Jan 21 17:10:38 localhost kernel: Key type blacklist registered
Jan 21 17:10:38 localhost kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Jan 21 17:10:38 localhost kernel: zbud: loaded
Jan 21 17:10:38 localhost kernel: integrity: Platform Keyring initialized
Jan 21 17:10:38 localhost kernel: integrity: Machine keyring initialized
Jan 21 17:10:38 localhost kernel: Freeing initrd memory: 87956K
Jan 21 17:10:38 localhost kernel: NET: Registered PF_ALG protocol family
Jan 21 17:10:38 localhost kernel: xor: automatically using best checksumming function   avx       
Jan 21 17:10:38 localhost kernel: Key type asymmetric registered
Jan 21 17:10:38 localhost kernel: Asymmetric key parser 'x509' registered
Jan 21 17:10:38 localhost kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Jan 21 17:10:38 localhost kernel: io scheduler mq-deadline registered
Jan 21 17:10:38 localhost kernel: io scheduler kyber registered
Jan 21 17:10:38 localhost kernel: io scheduler bfq registered
Jan 21 17:10:38 localhost kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Jan 21 17:10:38 localhost kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Jan 21 17:10:38 localhost kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Jan 21 17:10:38 localhost kernel: ACPI: button: Power Button [PWRF]
Jan 21 17:10:38 localhost kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Jan 21 17:10:38 localhost kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Jan 21 17:10:38 localhost kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Jan 21 17:10:38 localhost kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Jan 21 17:10:38 localhost kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Jan 21 17:10:38 localhost kernel: Non-volatile memory driver v1.3
Jan 21 17:10:38 localhost kernel: rdac: device handler registered
Jan 21 17:10:38 localhost kernel: hp_sw: device handler registered
Jan 21 17:10:38 localhost kernel: emc: device handler registered
Jan 21 17:10:38 localhost kernel: alua: device handler registered
Jan 21 17:10:38 localhost kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Jan 21 17:10:38 localhost kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Jan 21 17:10:38 localhost kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Jan 21 17:10:38 localhost kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Jan 21 17:10:38 localhost kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Jan 21 17:10:38 localhost kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Jan 21 17:10:38 localhost kernel: usb usb1: Product: UHCI Host Controller
Jan 21 17:10:38 localhost kernel: usb usb1: Manufacturer: Linux 5.14.0-661.el9.x86_64 uhci_hcd
Jan 21 17:10:38 localhost kernel: usb usb1: SerialNumber: 0000:00:01.2
Jan 21 17:10:38 localhost kernel: hub 1-0:1.0: USB hub found
Jan 21 17:10:38 localhost kernel: hub 1-0:1.0: 2 ports detected
Jan 21 17:10:38 localhost kernel: usbcore: registered new interface driver usbserial_generic
Jan 21 17:10:38 localhost kernel: usbserial: USB Serial support registered for generic
Jan 21 17:10:38 localhost kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Jan 21 17:10:38 localhost kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Jan 21 17:10:38 localhost kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Jan 21 17:10:38 localhost kernel: mousedev: PS/2 mouse device common for all mice
Jan 21 17:10:38 localhost kernel: rtc_cmos 00:04: RTC can wake from S4
Jan 21 17:10:38 localhost kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Jan 21 17:10:38 localhost kernel: rtc_cmos 00:04: registered as rtc0
Jan 21 17:10:38 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Jan 21 17:10:38 localhost kernel: rtc_cmos 00:04: setting system clock to 2026-01-21T17:10:37 UTC (1769015437)
Jan 21 17:10:38 localhost kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Jan 21 17:10:38 localhost kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Jan 21 17:10:38 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Jan 21 17:10:38 localhost kernel: hid: raw HID events driver (C) Jiri Kosina
Jan 21 17:10:38 localhost kernel: usbcore: registered new interface driver usbhid
Jan 21 17:10:38 localhost kernel: usbhid: USB HID core driver
Jan 21 17:10:38 localhost kernel: drop_monitor: Initializing network drop monitor service
Jan 21 17:10:38 localhost kernel: Initializing XFRM netlink socket
Jan 21 17:10:38 localhost kernel: NET: Registered PF_INET6 protocol family
Jan 21 17:10:38 localhost kernel: Segment Routing with IPv6
Jan 21 17:10:38 localhost kernel: NET: Registered PF_PACKET protocol family
Jan 21 17:10:38 localhost kernel: mpls_gso: MPLS GSO support
Jan 21 17:10:38 localhost kernel: IPI shorthand broadcast: enabled
Jan 21 17:10:38 localhost kernel: AVX2 version of gcm_enc/dec engaged.
Jan 21 17:10:38 localhost kernel: AES CTR mode by8 optimization enabled
Jan 21 17:10:38 localhost kernel: sched_clock: Marking stable (1211007892, 151802769)->(1489154864, -126344203)
Jan 21 17:10:38 localhost kernel: registered taskstats version 1
Jan 21 17:10:38 localhost kernel: Loading compiled-in X.509 certificates
Jan 21 17:10:38 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 04453f216699002fd63185eeab832de990bee6d7'
Jan 21 17:10:38 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Jan 21 17:10:38 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Jan 21 17:10:38 localhost kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Jan 21 17:10:38 localhost kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Jan 21 17:10:38 localhost kernel: Demotion targets for Node 0: null
Jan 21 17:10:38 localhost kernel: page_owner is disabled
Jan 21 17:10:38 localhost kernel: Key type .fscrypt registered
Jan 21 17:10:38 localhost kernel: Key type fscrypt-provisioning registered
Jan 21 17:10:38 localhost kernel: Key type big_key registered
Jan 21 17:10:38 localhost kernel: Key type encrypted registered
Jan 21 17:10:38 localhost kernel: ima: No TPM chip found, activating TPM-bypass!
Jan 21 17:10:38 localhost kernel: Loading compiled-in module X.509 certificates
Jan 21 17:10:38 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 04453f216699002fd63185eeab832de990bee6d7'
Jan 21 17:10:38 localhost kernel: ima: Allocated hash algorithm: sha256
Jan 21 17:10:38 localhost kernel: ima: No architecture policies found
Jan 21 17:10:38 localhost kernel: evm: Initialising EVM extended attributes:
Jan 21 17:10:38 localhost kernel: evm: security.selinux
Jan 21 17:10:38 localhost kernel: evm: security.SMACK64 (disabled)
Jan 21 17:10:38 localhost kernel: evm: security.SMACK64EXEC (disabled)
Jan 21 17:10:38 localhost kernel: evm: security.SMACK64TRANSMUTE (disabled)
Jan 21 17:10:38 localhost kernel: evm: security.SMACK64MMAP (disabled)
Jan 21 17:10:38 localhost kernel: evm: security.apparmor (disabled)
Jan 21 17:10:38 localhost kernel: evm: security.ima
Jan 21 17:10:38 localhost kernel: evm: security.capability
Jan 21 17:10:38 localhost kernel: evm: HMAC attrs: 0x1
Jan 21 17:10:38 localhost kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Jan 21 17:10:38 localhost kernel: Running certificate verification RSA selftest
Jan 21 17:10:38 localhost kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Jan 21 17:10:38 localhost kernel: Running certificate verification ECDSA selftest
Jan 21 17:10:38 localhost kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Jan 21 17:10:38 localhost kernel: clk: Disabling unused clocks
Jan 21 17:10:38 localhost kernel: Freeing unused decrypted memory: 2028K
Jan 21 17:10:38 localhost kernel: Freeing unused kernel image (initmem) memory: 4200K
Jan 21 17:10:38 localhost kernel: Write protecting the kernel read-only data: 30720k
Jan 21 17:10:38 localhost kernel: Freeing unused kernel image (rodata/data gap) memory: 420K
Jan 21 17:10:38 localhost kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Jan 21 17:10:38 localhost kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Jan 21 17:10:38 localhost kernel: usb 1-1: Product: QEMU USB Tablet
Jan 21 17:10:38 localhost kernel: usb 1-1: Manufacturer: QEMU
Jan 21 17:10:38 localhost kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Jan 21 17:10:38 localhost kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Jan 21 17:10:38 localhost kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Jan 21 17:10:38 localhost kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Jan 21 17:10:38 localhost kernel: Run /init as init process
Jan 21 17:10:38 localhost kernel:   with arguments:
Jan 21 17:10:38 localhost kernel:     /init
Jan 21 17:10:38 localhost kernel:   with environment:
Jan 21 17:10:38 localhost kernel:     HOME=/
Jan 21 17:10:38 localhost kernel:     TERM=linux
Jan 21 17:10:38 localhost kernel:     BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64
Jan 21 17:10:38 localhost systemd[1]: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Jan 21 17:10:38 localhost systemd[1]: Detected virtualization kvm.
Jan 21 17:10:38 localhost systemd[1]: Detected architecture x86-64.
Jan 21 17:10:38 localhost systemd[1]: Running in initrd.
Jan 21 17:10:38 localhost systemd[1]: No hostname configured, using default hostname.
Jan 21 17:10:38 localhost systemd[1]: Hostname set to <localhost>.
Jan 21 17:10:38 localhost systemd[1]: Initializing machine ID from VM UUID.
Jan 21 17:10:38 localhost systemd[1]: Queued start job for default target Initrd Default Target.
Jan 21 17:10:38 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Jan 21 17:10:38 localhost systemd[1]: Reached target Local Encrypted Volumes.
Jan 21 17:10:38 localhost systemd[1]: Reached target Initrd /usr File System.
Jan 21 17:10:38 localhost systemd[1]: Reached target Local File Systems.
Jan 21 17:10:38 localhost systemd[1]: Reached target Path Units.
Jan 21 17:10:38 localhost systemd[1]: Reached target Slice Units.
Jan 21 17:10:38 localhost systemd[1]: Reached target Swaps.
Jan 21 17:10:38 localhost systemd[1]: Reached target Timer Units.
Jan 21 17:10:38 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Jan 21 17:10:38 localhost systemd[1]: Listening on Journal Socket (/dev/log).
Jan 21 17:10:38 localhost systemd[1]: Listening on Journal Socket.
Jan 21 17:10:38 localhost systemd[1]: Listening on udev Control Socket.
Jan 21 17:10:38 localhost systemd[1]: Listening on udev Kernel Socket.
Jan 21 17:10:38 localhost systemd[1]: Reached target Socket Units.
Jan 21 17:10:38 localhost systemd[1]: Starting Create List of Static Device Nodes...
Jan 21 17:10:38 localhost systemd[1]: Starting Journal Service...
Jan 21 17:10:38 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Jan 21 17:10:38 localhost systemd[1]: Starting Apply Kernel Variables...
Jan 21 17:10:38 localhost systemd[1]: Starting Create System Users...
Jan 21 17:10:38 localhost systemd[1]: Starting Setup Virtual Console...
Jan 21 17:10:38 localhost systemd[1]: Finished Create List of Static Device Nodes.
Jan 21 17:10:38 localhost systemd[1]: Finished Apply Kernel Variables.
Jan 21 17:10:38 localhost systemd[1]: Finished Create System Users.
Jan 21 17:10:38 localhost systemd-journald[304]: Journal started
Jan 21 17:10:38 localhost systemd-journald[304]: Runtime Journal (/run/log/journal/3193fe30ac0a415eb2b255df2b1703a4) is 8.0M, max 153.6M, 145.6M free.
Jan 21 17:10:38 localhost systemd-sysusers[308]: Creating group 'users' with GID 100.
Jan 21 17:10:38 localhost systemd-sysusers[308]: Creating group 'dbus' with GID 81.
Jan 21 17:10:38 localhost systemd-sysusers[308]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Jan 21 17:10:38 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Jan 21 17:10:38 localhost systemd[1]: Started Journal Service.
Jan 21 17:10:38 localhost systemd[1]: Starting Create Volatile Files and Directories...
Jan 21 17:10:38 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Jan 21 17:10:38 localhost systemd[1]: Finished Create Volatile Files and Directories.
Jan 21 17:10:38 localhost systemd[1]: Finished Setup Virtual Console.
Jan 21 17:10:38 localhost systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Jan 21 17:10:38 localhost systemd[1]: Starting dracut cmdline hook...
Jan 21 17:10:38 localhost dracut-cmdline[324]: dracut-9 dracut-057-102.git20250818.el9
Jan 21 17:10:38 localhost dracut-cmdline[324]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64 root=UUID=22ac9141-3960-4912-b20e-19fc8a328d40 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 21 17:10:38 localhost systemd[1]: Finished dracut cmdline hook.
Jan 21 17:10:38 localhost systemd[1]: Starting dracut pre-udev hook...
Jan 21 17:10:38 localhost kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Jan 21 17:10:38 localhost kernel: device-mapper: uevent: version 1.0.3
Jan 21 17:10:38 localhost kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Jan 21 17:10:38 localhost kernel: RPC: Registered named UNIX socket transport module.
Jan 21 17:10:38 localhost kernel: RPC: Registered udp transport module.
Jan 21 17:10:38 localhost kernel: RPC: Registered tcp transport module.
Jan 21 17:10:38 localhost kernel: RPC: Registered tcp-with-tls transport module.
Jan 21 17:10:38 localhost kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Jan 21 17:10:38 localhost rpc.statd[441]: Version 2.5.4 starting
Jan 21 17:10:38 localhost rpc.statd[441]: Initializing NSM state
Jan 21 17:10:38 localhost rpc.idmapd[446]: Setting log level to 0
Jan 21 17:10:38 localhost systemd[1]: Finished dracut pre-udev hook.
Jan 21 17:10:38 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Jan 21 17:10:39 localhost systemd-udevd[459]: Using default interface naming scheme 'rhel-9.0'.
Jan 21 17:10:39 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Jan 21 17:10:39 localhost systemd[1]: Starting dracut pre-trigger hook...
Jan 21 17:10:39 localhost systemd[1]: Finished dracut pre-trigger hook.
Jan 21 17:10:39 localhost systemd[1]: Starting Coldplug All udev Devices...
Jan 21 17:10:39 localhost systemd[1]: Created slice Slice /system/modprobe.
Jan 21 17:10:39 localhost systemd[1]: Starting Load Kernel Module configfs...
Jan 21 17:10:39 localhost systemd[1]: Finished Coldplug All udev Devices.
Jan 21 17:10:39 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 21 17:10:39 localhost systemd[1]: Finished Load Kernel Module configfs.
Jan 21 17:10:39 localhost systemd[1]: Mounting Kernel Configuration File System...
Jan 21 17:10:39 localhost systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Jan 21 17:10:39 localhost systemd[1]: Reached target Network.
Jan 21 17:10:39 localhost systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Jan 21 17:10:39 localhost systemd[1]: Starting dracut initqueue hook...
Jan 21 17:10:39 localhost systemd[1]: Mounted Kernel Configuration File System.
Jan 21 17:10:39 localhost systemd[1]: Reached target System Initialization.
Jan 21 17:10:39 localhost systemd[1]: Reached target Basic System.
Jan 21 17:10:39 localhost kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Jan 21 17:10:39 localhost kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Jan 21 17:10:39 localhost kernel:  vda: vda1
Jan 21 17:10:39 localhost kernel: libata version 3.00 loaded.
Jan 21 17:10:39 localhost systemd-udevd[489]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 17:10:39 localhost kernel: ata_piix 0000:00:01.1: version 2.13
Jan 21 17:10:39 localhost kernel: scsi host0: ata_piix
Jan 21 17:10:39 localhost kernel: scsi host1: ata_piix
Jan 21 17:10:39 localhost kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Jan 21 17:10:39 localhost kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Jan 21 17:10:39 localhost systemd[1]: Found device /dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40.
Jan 21 17:10:39 localhost systemd[1]: Reached target Initrd Root Device.
Jan 21 17:10:39 localhost kernel: ata1: found unknown device (class 0)
Jan 21 17:10:39 localhost kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Jan 21 17:10:39 localhost kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Jan 21 17:10:39 localhost kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Jan 21 17:10:39 localhost kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Jan 21 17:10:39 localhost kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Jan 21 17:10:39 localhost kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0
Jan 21 17:10:39 localhost systemd[1]: Finished dracut initqueue hook.
Jan 21 17:10:39 localhost systemd[1]: Reached target Preparation for Remote File Systems.
Jan 21 17:10:39 localhost systemd[1]: Reached target Remote Encrypted Volumes.
Jan 21 17:10:39 localhost systemd[1]: Reached target Remote File Systems.
Jan 21 17:10:39 localhost systemd[1]: Starting dracut pre-mount hook...
Jan 21 17:10:39 localhost systemd[1]: Finished dracut pre-mount hook.
Jan 21 17:10:39 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40...
Jan 21 17:10:39 localhost systemd-fsck[554]: /usr/sbin/fsck.xfs: XFS file system.
Jan 21 17:10:39 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40.
Jan 21 17:10:39 localhost systemd[1]: Mounting /sysroot...
Jan 21 17:10:40 localhost kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Jan 21 17:10:40 localhost kernel: XFS (vda1): Mounting V5 Filesystem 22ac9141-3960-4912-b20e-19fc8a328d40
Jan 21 17:10:40 localhost kernel: XFS (vda1): Ending clean mount
Jan 21 17:10:40 localhost systemd[1]: Mounted /sysroot.
Jan 21 17:10:40 localhost systemd[1]: Reached target Initrd Root File System.
Jan 21 17:10:40 localhost systemd[1]: Starting Mountpoints Configured in the Real Root...
Jan 21 17:10:40 localhost systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Jan 21 17:10:40 localhost systemd[1]: Finished Mountpoints Configured in the Real Root.
Jan 21 17:10:40 localhost systemd[1]: Reached target Initrd File Systems.
Jan 21 17:10:40 localhost systemd[1]: Reached target Initrd Default Target.
Jan 21 17:10:40 localhost systemd[1]: Starting dracut mount hook...
Jan 21 17:10:40 localhost systemd[1]: Finished dracut mount hook.
Jan 21 17:10:40 localhost systemd[1]: Starting dracut pre-pivot and cleanup hook...
Jan 21 17:10:40 localhost rpc.idmapd[446]: exiting on signal 15
Jan 21 17:10:40 localhost systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Jan 21 17:10:40 localhost systemd[1]: Finished dracut pre-pivot and cleanup hook.
Jan 21 17:10:40 localhost systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Jan 21 17:10:40 localhost systemd[1]: Stopped target Network.
Jan 21 17:10:40 localhost systemd[1]: Stopped target Remote Encrypted Volumes.
Jan 21 17:10:40 localhost systemd[1]: Stopped target Timer Units.
Jan 21 17:10:40 localhost systemd[1]: dbus.socket: Deactivated successfully.
Jan 21 17:10:40 localhost systemd[1]: Closed D-Bus System Message Bus Socket.
Jan 21 17:10:40 localhost systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Jan 21 17:10:40 localhost systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Jan 21 17:10:40 localhost systemd[1]: Stopped target Initrd Default Target.
Jan 21 17:10:40 localhost systemd[1]: Stopped target Basic System.
Jan 21 17:10:40 localhost systemd[1]: Stopped target Initrd Root Device.
Jan 21 17:10:40 localhost systemd[1]: Stopped target Initrd /usr File System.
Jan 21 17:10:40 localhost systemd[1]: Stopped target Path Units.
Jan 21 17:10:40 localhost systemd[1]: Stopped target Remote File Systems.
Jan 21 17:10:40 localhost systemd[1]: Stopped target Preparation for Remote File Systems.
Jan 21 17:10:40 localhost systemd[1]: Stopped target Slice Units.
Jan 21 17:10:40 localhost systemd[1]: Stopped target Socket Units.
Jan 21 17:10:40 localhost systemd[1]: Stopped target System Initialization.
Jan 21 17:10:40 localhost systemd[1]: Stopped target Local File Systems.
Jan 21 17:10:40 localhost systemd[1]: Stopped target Swaps.
Jan 21 17:10:40 localhost systemd[1]: dracut-mount.service: Deactivated successfully.
Jan 21 17:10:40 localhost systemd[1]: Stopped dracut mount hook.
Jan 21 17:10:40 localhost systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Jan 21 17:10:40 localhost systemd[1]: Stopped dracut pre-mount hook.
Jan 21 17:10:40 localhost systemd[1]: Stopped target Local Encrypted Volumes.
Jan 21 17:10:40 localhost systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Jan 21 17:10:40 localhost systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Jan 21 17:10:40 localhost systemd[1]: dracut-initqueue.service: Deactivated successfully.
Jan 21 17:10:40 localhost systemd[1]: Stopped dracut initqueue hook.
Jan 21 17:10:40 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jan 21 17:10:40 localhost systemd[1]: Stopped Apply Kernel Variables.
Jan 21 17:10:40 localhost systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Jan 21 17:10:40 localhost systemd[1]: Stopped Create Volatile Files and Directories.
Jan 21 17:10:40 localhost systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Jan 21 17:10:40 localhost systemd[1]: Stopped Coldplug All udev Devices.
Jan 21 17:10:40 localhost systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Jan 21 17:10:40 localhost systemd[1]: Stopped dracut pre-trigger hook.
Jan 21 17:10:40 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Jan 21 17:10:40 localhost systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Jan 21 17:10:40 localhost systemd[1]: Stopped Setup Virtual Console.
Jan 21 17:10:40 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Jan 21 17:10:40 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Jan 21 17:10:40 localhost systemd[1]: initrd-cleanup.service: Deactivated successfully.
Jan 21 17:10:40 localhost systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Jan 21 17:10:40 localhost systemd[1]: systemd-udevd.service: Deactivated successfully.
Jan 21 17:10:40 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Jan 21 17:10:40 localhost systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Jan 21 17:10:40 localhost systemd[1]: Closed udev Control Socket.
Jan 21 17:10:40 localhost systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Jan 21 17:10:40 localhost systemd[1]: Closed udev Kernel Socket.
Jan 21 17:10:40 localhost systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Jan 21 17:10:40 localhost systemd[1]: Stopped dracut pre-udev hook.
Jan 21 17:10:40 localhost systemd[1]: dracut-cmdline.service: Deactivated successfully.
Jan 21 17:10:40 localhost systemd[1]: Stopped dracut cmdline hook.
Jan 21 17:10:40 localhost systemd[1]: Starting Cleanup udev Database...
Jan 21 17:10:40 localhost systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Jan 21 17:10:40 localhost systemd[1]: Stopped Create Static Device Nodes in /dev.
Jan 21 17:10:40 localhost systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Jan 21 17:10:40 localhost systemd[1]: Stopped Create List of Static Device Nodes.
Jan 21 17:10:40 localhost systemd[1]: systemd-sysusers.service: Deactivated successfully.
Jan 21 17:10:40 localhost systemd[1]: Stopped Create System Users.
Jan 21 17:10:40 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Jan 21 17:10:40 localhost systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Jan 21 17:10:40 localhost systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Jan 21 17:10:40 localhost systemd[1]: Finished Cleanup udev Database.
Jan 21 17:10:40 localhost systemd[1]: Reached target Switch Root.
Jan 21 17:10:40 localhost systemd[1]: Starting Switch Root...
Jan 21 17:10:40 localhost systemd[1]: Switching root.
Jan 21 17:10:40 localhost systemd-journald[304]: Journal stopped
Jan 21 17:10:41 localhost systemd-journald[304]: Received SIGTERM from PID 1 (systemd).
Jan 21 17:10:41 localhost kernel: audit: type=1404 audit(1769015440.717:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Jan 21 17:10:41 localhost kernel: SELinux:  policy capability network_peer_controls=1
Jan 21 17:10:41 localhost kernel: SELinux:  policy capability open_perms=1
Jan 21 17:10:41 localhost kernel: SELinux:  policy capability extended_socket_class=1
Jan 21 17:10:41 localhost kernel: SELinux:  policy capability always_check_network=0
Jan 21 17:10:41 localhost kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 21 17:10:41 localhost kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 21 17:10:41 localhost kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 21 17:10:41 localhost kernel: audit: type=1403 audit(1769015440.839:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Jan 21 17:10:41 localhost systemd[1]: Successfully loaded SELinux policy in 124.783ms.
Jan 21 17:10:41 localhost systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 27.983ms.
Jan 21 17:10:41 localhost systemd[1]: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Jan 21 17:10:41 localhost systemd[1]: Detected virtualization kvm.
Jan 21 17:10:41 localhost systemd[1]: Detected architecture x86-64.
Jan 21 17:10:41 localhost systemd-rc-local-generator[635]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 17:10:41 localhost systemd[1]: initrd-switch-root.service: Deactivated successfully.
Jan 21 17:10:41 localhost systemd[1]: Stopped Switch Root.
Jan 21 17:10:41 localhost systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Jan 21 17:10:41 localhost systemd[1]: Created slice Slice /system/getty.
Jan 21 17:10:41 localhost systemd[1]: Created slice Slice /system/serial-getty.
Jan 21 17:10:41 localhost systemd[1]: Created slice Slice /system/sshd-keygen.
Jan 21 17:10:41 localhost systemd[1]: Created slice User and Session Slice.
Jan 21 17:10:41 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Jan 21 17:10:41 localhost systemd[1]: Started Forward Password Requests to Wall Directory Watch.
Jan 21 17:10:41 localhost systemd[1]: Set up automount Arbitrary Executable File Formats File System Automount Point.
Jan 21 17:10:41 localhost systemd[1]: Reached target Local Encrypted Volumes.
Jan 21 17:10:41 localhost systemd[1]: Stopped target Switch Root.
Jan 21 17:10:41 localhost systemd[1]: Stopped target Initrd File Systems.
Jan 21 17:10:41 localhost systemd[1]: Stopped target Initrd Root File System.
Jan 21 17:10:41 localhost systemd[1]: Reached target Local Integrity Protected Volumes.
Jan 21 17:10:41 localhost systemd[1]: Reached target Path Units.
Jan 21 17:10:41 localhost systemd[1]: Reached target rpc_pipefs.target.
Jan 21 17:10:41 localhost systemd[1]: Reached target Slice Units.
Jan 21 17:10:41 localhost systemd[1]: Reached target Swaps.
Jan 21 17:10:41 localhost systemd[1]: Reached target Local Verity Protected Volumes.
Jan 21 17:10:41 localhost systemd[1]: Listening on RPCbind Server Activation Socket.
Jan 21 17:10:41 localhost systemd[1]: Reached target RPC Port Mapper.
Jan 21 17:10:41 localhost systemd[1]: Listening on Process Core Dump Socket.
Jan 21 17:10:41 localhost systemd[1]: Listening on initctl Compatibility Named Pipe.
Jan 21 17:10:41 localhost systemd[1]: Listening on udev Control Socket.
Jan 21 17:10:41 localhost systemd[1]: Listening on udev Kernel Socket.
Jan 21 17:10:41 localhost systemd[1]: Mounting Huge Pages File System...
Jan 21 17:10:41 localhost systemd[1]: Mounting POSIX Message Queue File System...
Jan 21 17:10:41 localhost systemd[1]: Mounting Kernel Debug File System...
Jan 21 17:10:41 localhost systemd[1]: Mounting Kernel Trace File System...
Jan 21 17:10:41 localhost systemd[1]: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Jan 21 17:10:41 localhost systemd[1]: Starting Create List of Static Device Nodes...
Jan 21 17:10:41 localhost systemd[1]: Starting Load Kernel Module configfs...
Jan 21 17:10:41 localhost systemd[1]: Starting Load Kernel Module drm...
Jan 21 17:10:41 localhost systemd[1]: Starting Load Kernel Module efi_pstore...
Jan 21 17:10:41 localhost systemd[1]: Starting Load Kernel Module fuse...
Jan 21 17:10:41 localhost systemd[1]: Starting Read and set NIS domainname from /etc/sysconfig/network...
Jan 21 17:10:41 localhost systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Jan 21 17:10:41 localhost systemd[1]: Stopped File System Check on Root Device.
Jan 21 17:10:41 localhost systemd[1]: Stopped Journal Service.
Jan 21 17:10:41 localhost kernel: fuse: init (API version 7.37)
Jan 21 17:10:41 localhost systemd[1]: Starting Journal Service...
Jan 21 17:10:41 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Jan 21 17:10:41 localhost systemd[1]: Starting Generate network units from Kernel command line...
Jan 21 17:10:41 localhost systemd[1]: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 21 17:10:41 localhost systemd[1]: Starting Remount Root and Kernel File Systems...
Jan 21 17:10:41 localhost systemd[1]: Repartition Root Disk was skipped because no trigger condition checks were met.
Jan 21 17:10:41 localhost systemd[1]: Starting Apply Kernel Variables...
Jan 21 17:10:41 localhost systemd[1]: Starting Coldplug All udev Devices...
Jan 21 17:10:41 localhost systemd[1]: Mounted Huge Pages File System.
Jan 21 17:10:41 localhost systemd-journald[676]: Journal started
Jan 21 17:10:41 localhost systemd-journald[676]: Runtime Journal (/run/log/journal/85ac68c10a6e7ae08ceb898dbdca0cb5) is 8.0M, max 153.6M, 145.6M free.
Jan 21 17:10:41 localhost systemd[1]: Queued start job for default target Multi-User System.
Jan 21 17:10:41 localhost systemd[1]: systemd-journald.service: Deactivated successfully.
Jan 21 17:10:41 localhost systemd[1]: Started Journal Service.
Jan 21 17:10:41 localhost kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Jan 21 17:10:41 localhost systemd[1]: Mounted POSIX Message Queue File System.
Jan 21 17:10:41 localhost systemd[1]: Mounted Kernel Debug File System.
Jan 21 17:10:41 localhost systemd[1]: Mounted Kernel Trace File System.
Jan 21 17:10:41 localhost systemd[1]: Finished Create List of Static Device Nodes.
Jan 21 17:10:41 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 21 17:10:41 localhost systemd[1]: Finished Load Kernel Module configfs.
Jan 21 17:10:41 localhost systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Jan 21 17:10:41 localhost systemd[1]: Finished Load Kernel Module efi_pstore.
Jan 21 17:10:41 localhost systemd[1]: modprobe@fuse.service: Deactivated successfully.
Jan 21 17:10:41 localhost systemd[1]: Finished Load Kernel Module fuse.
Jan 21 17:10:41 localhost systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Jan 21 17:10:41 localhost systemd[1]: Finished Generate network units from Kernel command line.
Jan 21 17:10:41 localhost kernel: ACPI: bus type drm_connector registered
Jan 21 17:10:41 localhost systemd[1]: Finished Remount Root and Kernel File Systems.
Jan 21 17:10:41 localhost systemd[1]: modprobe@drm.service: Deactivated successfully.
Jan 21 17:10:41 localhost systemd[1]: Finished Load Kernel Module drm.
Jan 21 17:10:41 localhost systemd[1]: Finished Apply Kernel Variables.
Jan 21 17:10:41 localhost systemd[1]: Mounting FUSE Control File System...
Jan 21 17:10:41 localhost systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Jan 21 17:10:41 localhost systemd[1]: Starting Rebuild Hardware Database...
Jan 21 17:10:41 localhost systemd[1]: Starting Flush Journal to Persistent Storage...
Jan 21 17:10:41 localhost systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Jan 21 17:10:41 localhost systemd[1]: Starting Load/Save OS Random Seed...
Jan 21 17:10:41 localhost systemd[1]: Starting Create System Users...
Jan 21 17:10:41 localhost systemd[1]: Mounted FUSE Control File System.
Jan 21 17:10:41 localhost systemd-journald[676]: Runtime Journal (/run/log/journal/85ac68c10a6e7ae08ceb898dbdca0cb5) is 8.0M, max 153.6M, 145.6M free.
Jan 21 17:10:41 localhost systemd-journald[676]: Received client request to flush runtime journal.
Jan 21 17:10:41 localhost systemd[1]: Finished Flush Journal to Persistent Storage.
Jan 21 17:10:41 localhost systemd[1]: Finished Load/Save OS Random Seed.
Jan 21 17:10:41 localhost systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Jan 21 17:10:41 localhost systemd[1]: Finished Create System Users.
Jan 21 17:10:41 localhost systemd[1]: Finished Coldplug All udev Devices.
Jan 21 17:10:41 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Jan 21 17:10:41 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Jan 21 17:10:41 localhost systemd[1]: Reached target Preparation for Local File Systems.
Jan 21 17:10:41 localhost systemd[1]: Reached target Local File Systems.
Jan 21 17:10:41 localhost systemd[1]: Starting Rebuild Dynamic Linker Cache...
Jan 21 17:10:41 localhost systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Jan 21 17:10:41 localhost systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Jan 21 17:10:41 localhost systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Jan 21 17:10:41 localhost systemd[1]: Starting Automatic Boot Loader Update...
Jan 21 17:10:41 localhost systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Jan 21 17:10:41 localhost systemd[1]: Starting Create Volatile Files and Directories...
Jan 21 17:10:41 localhost bootctl[694]: Couldn't find EFI system partition, skipping.
Jan 21 17:10:41 localhost systemd[1]: Finished Automatic Boot Loader Update.
Jan 21 17:10:41 localhost systemd[1]: Finished Create Volatile Files and Directories.
Jan 21 17:10:41 localhost systemd[1]: Starting Security Auditing Service...
Jan 21 17:10:41 localhost systemd[1]: Starting RPC Bind...
Jan 21 17:10:41 localhost systemd[1]: Starting Rebuild Journal Catalog...
Jan 21 17:10:41 localhost auditd[700]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Jan 21 17:10:41 localhost auditd[700]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Jan 21 17:10:41 localhost systemd[1]: Finished Rebuild Dynamic Linker Cache.
Jan 21 17:10:41 localhost systemd[1]: Started RPC Bind.
Jan 21 17:10:41 localhost systemd[1]: Finished Rebuild Journal Catalog.
Jan 21 17:10:41 localhost augenrules[705]: /sbin/augenrules: No change
Jan 21 17:10:41 localhost augenrules[720]: No rules
Jan 21 17:10:41 localhost augenrules[720]: enabled 1
Jan 21 17:10:41 localhost augenrules[720]: failure 1
Jan 21 17:10:41 localhost augenrules[720]: pid 700
Jan 21 17:10:41 localhost augenrules[720]: rate_limit 0
Jan 21 17:10:41 localhost augenrules[720]: backlog_limit 8192
Jan 21 17:10:41 localhost augenrules[720]: lost 0
Jan 21 17:10:41 localhost augenrules[720]: backlog 0
Jan 21 17:10:41 localhost augenrules[720]: backlog_wait_time 60000
Jan 21 17:10:41 localhost augenrules[720]: backlog_wait_time_actual 0
Jan 21 17:10:41 localhost systemd[1]: Started Security Auditing Service.
Jan 21 17:10:41 localhost systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Jan 21 17:10:41 localhost systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Jan 21 17:10:42 localhost systemd[1]: Finished Rebuild Hardware Database.
Jan 21 17:10:42 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Jan 21 17:10:42 localhost systemd[1]: Starting Update is Completed...
Jan 21 17:10:42 localhost systemd[1]: Finished Update is Completed.
Jan 21 17:10:42 localhost systemd-udevd[728]: Using default interface naming scheme 'rhel-9.0'.
Jan 21 17:10:42 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Jan 21 17:10:42 localhost systemd[1]: Reached target System Initialization.
Jan 21 17:10:42 localhost systemd[1]: Started dnf makecache --timer.
Jan 21 17:10:42 localhost systemd[1]: Started Daily rotation of log files.
Jan 21 17:10:42 localhost systemd[1]: Started Daily Cleanup of Temporary Directories.
Jan 21 17:10:42 localhost systemd[1]: Reached target Timer Units.
Jan 21 17:10:42 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Jan 21 17:10:42 localhost systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Jan 21 17:10:42 localhost systemd[1]: Reached target Socket Units.
Jan 21 17:10:42 localhost systemd[1]: Starting D-Bus System Message Bus...
Jan 21 17:10:42 localhost systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 21 17:10:42 localhost systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Jan 21 17:10:42 localhost systemd[1]: Starting Load Kernel Module configfs...
Jan 21 17:10:42 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 21 17:10:42 localhost systemd[1]: Finished Load Kernel Module configfs.
Jan 21 17:10:42 localhost systemd-udevd[747]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 17:10:42 localhost systemd[1]: Started D-Bus System Message Bus.
Jan 21 17:10:42 localhost systemd[1]: Reached target Basic System.
Jan 21 17:10:42 localhost dbus-broker-lau[755]: Ready
Jan 21 17:10:42 localhost kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Jan 21 17:10:42 localhost systemd[1]: Starting NTP client/server...
Jan 21 17:10:42 localhost kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Jan 21 17:10:42 localhost kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Jan 21 17:10:42 localhost kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Jan 21 17:10:42 localhost systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Jan 21 17:10:42 localhost systemd[1]: Starting Restore /run/initramfs on shutdown...
Jan 21 17:10:42 localhost systemd[1]: Starting IPv4 firewall with iptables...
Jan 21 17:10:42 localhost systemd[1]: Started irqbalance daemon.
Jan 21 17:10:42 localhost systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Jan 21 17:10:42 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 21 17:10:42 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 21 17:10:42 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 21 17:10:42 localhost systemd[1]: Reached target sshd-keygen.target.
Jan 21 17:10:42 localhost systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Jan 21 17:10:42 localhost systemd[1]: Reached target User and Group Name Lookups.
Jan 21 17:10:42 localhost systemd[1]: Starting User Login Management...
Jan 21 17:10:42 localhost systemd[1]: Finished Restore /run/initramfs on shutdown.
Jan 21 17:10:42 localhost chronyd[794]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Jan 21 17:10:42 localhost chronyd[794]: Loaded 0 symmetric keys
Jan 21 17:10:42 localhost chronyd[794]: Using right/UTC timezone to obtain leap second data
Jan 21 17:10:42 localhost chronyd[794]: Loaded seccomp filter (level 2)
Jan 21 17:10:42 localhost systemd[1]: Started NTP client/server.
Jan 21 17:10:42 localhost systemd-logind[782]: Watching system buttons on /dev/input/event0 (Power Button)
Jan 21 17:10:42 localhost systemd-logind[782]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Jan 21 17:10:42 localhost systemd-logind[782]: New seat seat0.
Jan 21 17:10:42 localhost systemd[1]: Started User Login Management.
Jan 21 17:10:42 localhost kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Jan 21 17:10:42 localhost kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Jan 21 17:10:42 localhost kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Jan 21 17:10:42 localhost kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Jan 21 17:10:42 localhost kernel: Console: switching to colour dummy device 80x25
Jan 21 17:10:42 localhost kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Jan 21 17:10:42 localhost kernel: [drm] features: -context_init
Jan 21 17:10:42 localhost kernel: [drm] number of scanouts: 1
Jan 21 17:10:42 localhost kernel: [drm] number of cap sets: 0
Jan 21 17:10:42 localhost kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Jan 21 17:10:42 localhost kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Jan 21 17:10:42 localhost kernel: Console: switching to colour frame buffer device 128x48
Jan 21 17:10:42 localhost kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Jan 21 17:10:42 localhost kernel: kvm_amd: TSC scaling supported
Jan 21 17:10:42 localhost kernel: kvm_amd: Nested Virtualization enabled
Jan 21 17:10:42 localhost kernel: kvm_amd: Nested Paging enabled
Jan 21 17:10:42 localhost kernel: kvm_amd: LBR virtualization supported
Jan 21 17:10:42 localhost iptables.init[776]: iptables: Applying firewall rules: [  OK  ]
Jan 21 17:10:42 localhost systemd[1]: Finished IPv4 firewall with iptables.
Jan 21 17:10:42 localhost cloud-init[836]: Cloud-init v. 24.4-8.el9 running 'init-local' at Wed, 21 Jan 2026 17:10:42 +0000. Up 6.50 seconds.
Jan 21 17:10:43 localhost kernel: ISO 9660 Extensions: Microsoft Joliet Level 3
Jan 21 17:10:43 localhost kernel: ISO 9660 Extensions: RRIP_1991A
Jan 21 17:10:43 localhost systemd[1]: run-cloud\x2dinit-tmp-tmpvcu_xywi.mount: Deactivated successfully.
Jan 21 17:10:43 localhost systemd[1]: Starting Hostname Service...
Jan 21 17:10:43 localhost systemd[1]: Started Hostname Service.
Jan 21 17:10:43 np0005590981.novalocal systemd-hostnamed[850]: Hostname set to <np0005590981.novalocal> (static)
Jan 21 17:10:43 np0005590981.novalocal systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Jan 21 17:10:43 np0005590981.novalocal systemd[1]: Reached target Preparation for Network.
Jan 21 17:10:43 np0005590981.novalocal systemd[1]: Starting Network Manager...
Jan 21 17:10:43 np0005590981.novalocal NetworkManager[855]: <info>  [1769015443.4101] NetworkManager (version 1.54.3-2.el9) is starting... (boot:23f14126-aa07-49ac-87bf-97b3c4c8c82d)
Jan 21 17:10:43 np0005590981.novalocal NetworkManager[855]: <info>  [1769015443.4107] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 21 17:10:43 np0005590981.novalocal NetworkManager[855]: <info>  [1769015443.4184] manager[0x5599e6999000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 21 17:10:43 np0005590981.novalocal NetworkManager[855]: <info>  [1769015443.4237] hostname: hostname: using hostnamed
Jan 21 17:10:43 np0005590981.novalocal NetworkManager[855]: <info>  [1769015443.4237] hostname: static hostname changed from (none) to "np0005590981.novalocal"
Jan 21 17:10:43 np0005590981.novalocal NetworkManager[855]: <info>  [1769015443.4241] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 21 17:10:43 np0005590981.novalocal NetworkManager[855]: <info>  [1769015443.4336] manager[0x5599e6999000]: rfkill: Wi-Fi hardware radio set enabled
Jan 21 17:10:43 np0005590981.novalocal NetworkManager[855]: <info>  [1769015443.4336] manager[0x5599e6999000]: rfkill: WWAN hardware radio set enabled
Jan 21 17:10:43 np0005590981.novalocal systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Jan 21 17:10:43 np0005590981.novalocal NetworkManager[855]: <info>  [1769015443.4398] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 21 17:10:43 np0005590981.novalocal NetworkManager[855]: <info>  [1769015443.4399] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 21 17:10:43 np0005590981.novalocal NetworkManager[855]: <info>  [1769015443.4400] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 21 17:10:43 np0005590981.novalocal NetworkManager[855]: <info>  [1769015443.4401] manager: Networking is enabled by state file
Jan 21 17:10:43 np0005590981.novalocal NetworkManager[855]: <info>  [1769015443.4403] settings: Loaded settings plugin: keyfile (internal)
Jan 21 17:10:43 np0005590981.novalocal NetworkManager[855]: <info>  [1769015443.4416] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 21 17:10:43 np0005590981.novalocal NetworkManager[855]: <info>  [1769015443.4433] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 21 17:10:43 np0005590981.novalocal NetworkManager[855]: <info>  [1769015443.4443] dhcp: init: Using DHCP client 'internal'
Jan 21 17:10:43 np0005590981.novalocal NetworkManager[855]: <info>  [1769015443.4447] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 21 17:10:43 np0005590981.novalocal NetworkManager[855]: <info>  [1769015443.4458] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 17:10:43 np0005590981.novalocal NetworkManager[855]: <info>  [1769015443.4465] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 21 17:10:43 np0005590981.novalocal NetworkManager[855]: <info>  [1769015443.4473] device (lo): Activation: starting connection 'lo' (5c2c2fbe-fb9d-4e67-b102-f58e14318d32)
Jan 21 17:10:43 np0005590981.novalocal NetworkManager[855]: <info>  [1769015443.4482] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 21 17:10:43 np0005590981.novalocal NetworkManager[855]: <info>  [1769015443.4485] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 21 17:10:43 np0005590981.novalocal NetworkManager[855]: <info>  [1769015443.4513] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 21 17:10:43 np0005590981.novalocal NetworkManager[855]: <info>  [1769015443.4517] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 21 17:10:43 np0005590981.novalocal NetworkManager[855]: <info>  [1769015443.4520] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 21 17:10:43 np0005590981.novalocal NetworkManager[855]: <info>  [1769015443.4523] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 21 17:10:43 np0005590981.novalocal NetworkManager[855]: <info>  [1769015443.4525] device (eth0): carrier: link connected
Jan 21 17:10:43 np0005590981.novalocal NetworkManager[855]: <info>  [1769015443.4529] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 21 17:10:43 np0005590981.novalocal NetworkManager[855]: <info>  [1769015443.4534] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Jan 21 17:10:43 np0005590981.novalocal NetworkManager[855]: <info>  [1769015443.4539] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 21 17:10:43 np0005590981.novalocal NetworkManager[855]: <info>  [1769015443.4543] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 21 17:10:43 np0005590981.novalocal NetworkManager[855]: <info>  [1769015443.4545] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 21 17:10:43 np0005590981.novalocal NetworkManager[855]: <info>  [1769015443.4547] manager: NetworkManager state is now CONNECTING
Jan 21 17:10:43 np0005590981.novalocal NetworkManager[855]: <info>  [1769015443.4549] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 21 17:10:43 np0005590981.novalocal NetworkManager[855]: <info>  [1769015443.4555] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 21 17:10:43 np0005590981.novalocal NetworkManager[855]: <info>  [1769015443.4559] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 21 17:10:43 np0005590981.novalocal NetworkManager[855]: <info>  [1769015443.4610] dhcp4 (eth0): state changed new lease, address=38.102.83.50
Jan 21 17:10:43 np0005590981.novalocal NetworkManager[855]: <info>  [1769015443.4620] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 21 17:10:43 np0005590981.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 21 17:10:43 np0005590981.novalocal NetworkManager[855]: <info>  [1769015443.4638] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 21 17:10:43 np0005590981.novalocal systemd[1]: Started Network Manager.
Jan 21 17:10:43 np0005590981.novalocal systemd[1]: Reached target Network.
Jan 21 17:10:43 np0005590981.novalocal systemd[1]: Starting Network Manager Wait Online...
Jan 21 17:10:43 np0005590981.novalocal systemd[1]: Starting GSSAPI Proxy Daemon...
Jan 21 17:10:43 np0005590981.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 21 17:10:43 np0005590981.novalocal NetworkManager[855]: <info>  [1769015443.4882] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 21 17:10:43 np0005590981.novalocal NetworkManager[855]: <info>  [1769015443.4885] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 21 17:10:43 np0005590981.novalocal NetworkManager[855]: <info>  [1769015443.4887] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 21 17:10:43 np0005590981.novalocal NetworkManager[855]: <info>  [1769015443.4893] device (lo): Activation: successful, device activated.
Jan 21 17:10:43 np0005590981.novalocal NetworkManager[855]: <info>  [1769015443.4899] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 21 17:10:43 np0005590981.novalocal NetworkManager[855]: <info>  [1769015443.4903] manager: NetworkManager state is now CONNECTED_SITE
Jan 21 17:10:43 np0005590981.novalocal NetworkManager[855]: <info>  [1769015443.4905] device (eth0): Activation: successful, device activated.
Jan 21 17:10:43 np0005590981.novalocal NetworkManager[855]: <info>  [1769015443.4909] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 21 17:10:43 np0005590981.novalocal NetworkManager[855]: <info>  [1769015443.4911] manager: startup complete
Jan 21 17:10:43 np0005590981.novalocal systemd[1]: Started GSSAPI Proxy Daemon.
Jan 21 17:10:43 np0005590981.novalocal systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Jan 21 17:10:43 np0005590981.novalocal systemd[1]: Reached target NFS client services.
Jan 21 17:10:43 np0005590981.novalocal systemd[1]: Reached target Preparation for Remote File Systems.
Jan 21 17:10:43 np0005590981.novalocal systemd[1]: Reached target Remote File Systems.
Jan 21 17:10:43 np0005590981.novalocal systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 21 17:10:43 np0005590981.novalocal systemd[1]: Finished Network Manager Wait Online.
Jan 21 17:10:43 np0005590981.novalocal systemd[1]: Starting Cloud-init: Network Stage...
Jan 21 17:10:43 np0005590981.novalocal cloud-init[918]: Cloud-init v. 24.4-8.el9 running 'init' at Wed, 21 Jan 2026 17:10:43 +0000. Up 7.44 seconds.
Jan 21 17:10:43 np0005590981.novalocal cloud-init[918]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Jan 21 17:10:43 np0005590981.novalocal cloud-init[918]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Jan 21 17:10:43 np0005590981.novalocal cloud-init[918]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Jan 21 17:10:43 np0005590981.novalocal cloud-init[918]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Jan 21 17:10:43 np0005590981.novalocal cloud-init[918]: ci-info: |  eth0  | True |         38.102.83.50         | 255.255.255.0 | global | fa:16:3e:9c:bc:d1 |
Jan 21 17:10:43 np0005590981.novalocal cloud-init[918]: ci-info: |  eth0  | True | fe80::f816:3eff:fe9c:bcd1/64 |       .       |  link  | fa:16:3e:9c:bc:d1 |
Jan 21 17:10:43 np0005590981.novalocal cloud-init[918]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Jan 21 17:10:43 np0005590981.novalocal cloud-init[918]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Jan 21 17:10:43 np0005590981.novalocal cloud-init[918]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Jan 21 17:10:43 np0005590981.novalocal cloud-init[918]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Jan 21 17:10:43 np0005590981.novalocal cloud-init[918]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Jan 21 17:10:43 np0005590981.novalocal cloud-init[918]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Jan 21 17:10:43 np0005590981.novalocal cloud-init[918]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Jan 21 17:10:43 np0005590981.novalocal cloud-init[918]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Jan 21 17:10:43 np0005590981.novalocal cloud-init[918]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Jan 21 17:10:43 np0005590981.novalocal cloud-init[918]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Jan 21 17:10:43 np0005590981.novalocal cloud-init[918]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Jan 21 17:10:43 np0005590981.novalocal cloud-init[918]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Jan 21 17:10:43 np0005590981.novalocal cloud-init[918]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 21 17:10:43 np0005590981.novalocal cloud-init[918]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Jan 21 17:10:43 np0005590981.novalocal cloud-init[918]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 21 17:10:43 np0005590981.novalocal cloud-init[918]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Jan 21 17:10:43 np0005590981.novalocal cloud-init[918]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Jan 21 17:10:43 np0005590981.novalocal cloud-init[918]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 21 17:10:44 np0005590981.novalocal useradd[984]: new group: name=cloud-user, GID=1001
Jan 21 17:10:44 np0005590981.novalocal useradd[984]: new user: name=cloud-user, UID=1001, GID=1001, home=/home/cloud-user, shell=/bin/bash, from=none
Jan 21 17:10:44 np0005590981.novalocal useradd[984]: add 'cloud-user' to group 'adm'
Jan 21 17:10:44 np0005590981.novalocal useradd[984]: add 'cloud-user' to group 'systemd-journal'
Jan 21 17:10:44 np0005590981.novalocal useradd[984]: add 'cloud-user' to shadow group 'adm'
Jan 21 17:10:44 np0005590981.novalocal useradd[984]: add 'cloud-user' to shadow group 'systemd-journal'
Jan 21 17:10:45 np0005590981.novalocal cloud-init[918]: Generating public/private rsa key pair.
Jan 21 17:10:45 np0005590981.novalocal cloud-init[918]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Jan 21 17:10:45 np0005590981.novalocal cloud-init[918]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Jan 21 17:10:45 np0005590981.novalocal cloud-init[918]: The key fingerprint is:
Jan 21 17:10:45 np0005590981.novalocal cloud-init[918]: SHA256:JnuBkp7Tv+KD4m0tJsqiwdD7ogg1lnB9J71LAAREYFM root@np0005590981.novalocal
Jan 21 17:10:45 np0005590981.novalocal cloud-init[918]: The key's randomart image is:
Jan 21 17:10:45 np0005590981.novalocal cloud-init[918]: +---[RSA 3072]----+
Jan 21 17:10:45 np0005590981.novalocal cloud-init[918]: |.*=Eo            |
Jan 21 17:10:45 np0005590981.novalocal cloud-init[918]: |. .. . .         |
Jan 21 17:10:45 np0005590981.novalocal cloud-init[918]: |. . . + o        |
Jan 21 17:10:45 np0005590981.novalocal cloud-init[918]: | + . o = .       |
Jan 21 17:10:45 np0005590981.novalocal cloud-init[918]: |. * o o S        |
Jan 21 17:10:45 np0005590981.novalocal cloud-init[918]: |oo + + = o       |
Jan 21 17:10:45 np0005590981.novalocal cloud-init[918]: |o.. +oo o        |
Jan 21 17:10:45 np0005590981.novalocal cloud-init[918]: |=.+o=.+o         |
Jan 21 17:10:45 np0005590981.novalocal cloud-init[918]: |B=.*oo.oo.       |
Jan 21 17:10:45 np0005590981.novalocal cloud-init[918]: +----[SHA256]-----+
Jan 21 17:10:45 np0005590981.novalocal cloud-init[918]: Generating public/private ecdsa key pair.
Jan 21 17:10:45 np0005590981.novalocal cloud-init[918]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Jan 21 17:10:45 np0005590981.novalocal cloud-init[918]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Jan 21 17:10:45 np0005590981.novalocal cloud-init[918]: The key fingerprint is:
Jan 21 17:10:45 np0005590981.novalocal cloud-init[918]: SHA256:F1b7jbva7qMH/w9Vo6eee27uwSBZLbM43A4cSViQZIA root@np0005590981.novalocal
Jan 21 17:10:45 np0005590981.novalocal cloud-init[918]: The key's randomart image is:
Jan 21 17:10:45 np0005590981.novalocal cloud-init[918]: +---[ECDSA 256]---+
Jan 21 17:10:45 np0005590981.novalocal cloud-init[918]: |      ..o+=o.    |
Jan 21 17:10:45 np0005590981.novalocal cloud-init[918]: |     E  .o.....  |
Jan 21 17:10:45 np0005590981.novalocal cloud-init[918]: |          oo.+ o.|
Jan 21 17:10:45 np0005590981.novalocal cloud-init[918]: |         .o.*.=oo|
Jan 21 17:10:45 np0005590981.novalocal cloud-init[918]: |        S .O =o.o|
Jan 21 17:10:45 np0005590981.novalocal cloud-init[918]: |         .  * =o |
Jan 21 17:10:45 np0005590981.novalocal cloud-init[918]: |             =oo |
Jan 21 17:10:45 np0005590981.novalocal cloud-init[918]: |            ..==.|
Jan 21 17:10:45 np0005590981.novalocal cloud-init[918]: |            oX%*+|
Jan 21 17:10:45 np0005590981.novalocal cloud-init[918]: +----[SHA256]-----+
Jan 21 17:10:45 np0005590981.novalocal cloud-init[918]: Generating public/private ed25519 key pair.
Jan 21 17:10:45 np0005590981.novalocal cloud-init[918]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Jan 21 17:10:45 np0005590981.novalocal cloud-init[918]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Jan 21 17:10:45 np0005590981.novalocal cloud-init[918]: The key fingerprint is:
Jan 21 17:10:45 np0005590981.novalocal cloud-init[918]: SHA256:Ka7g3IIskYXPKJOPopmLqQZ/uQtB0kCa7YK+bqbjSm0 root@np0005590981.novalocal
Jan 21 17:10:45 np0005590981.novalocal cloud-init[918]: The key's randomart image is:
Jan 21 17:10:45 np0005590981.novalocal cloud-init[918]: +--[ED25519 256]--+
Jan 21 17:10:45 np0005590981.novalocal cloud-init[918]: |o.               |
Jan 21 17:10:45 np0005590981.novalocal cloud-init[918]: |.=               |
Jan 21 17:10:45 np0005590981.novalocal cloud-init[918]: |+.+              |
Jan 21 17:10:45 np0005590981.novalocal cloud-init[918]: |o+.      .       |
Jan 21 17:10:45 np0005590981.novalocal cloud-init[918]: |oBo   . S        |
Jan 21 17:10:45 np0005590981.novalocal cloud-init[918]: |X.+. . .         |
Jan 21 17:10:45 np0005590981.novalocal cloud-init[918]: |+XoE ..          |
Jan 21 17:10:45 np0005590981.novalocal cloud-init[918]: |O@*+o.           |
Jan 21 17:10:45 np0005590981.novalocal cloud-init[918]: |^*oo=o           |
Jan 21 17:10:45 np0005590981.novalocal cloud-init[918]: +----[SHA256]-----+
Jan 21 17:10:45 np0005590981.novalocal systemd[1]: Finished Cloud-init: Network Stage.
Jan 21 17:10:45 np0005590981.novalocal systemd[1]: Reached target Cloud-config availability.
Jan 21 17:10:45 np0005590981.novalocal systemd[1]: Reached target Network is Online.
Jan 21 17:10:45 np0005590981.novalocal systemd[1]: Starting Cloud-init: Config Stage...
Jan 21 17:10:45 np0005590981.novalocal systemd[1]: Starting Crash recovery kernel arming...
Jan 21 17:10:45 np0005590981.novalocal systemd[1]: Starting Notify NFS peers of a restart...
Jan 21 17:10:45 np0005590981.novalocal systemd[1]: Starting System Logging Service...
Jan 21 17:10:45 np0005590981.novalocal sm-notify[1001]: Version 2.5.4 starting
Jan 21 17:10:45 np0005590981.novalocal systemd[1]: Starting OpenSSH server daemon...
Jan 21 17:10:45 np0005590981.novalocal systemd[1]: Starting Permit User Sessions...
Jan 21 17:10:45 np0005590981.novalocal systemd[1]: Started Notify NFS peers of a restart.
Jan 21 17:10:45 np0005590981.novalocal sshd[1003]: Server listening on 0.0.0.0 port 22.
Jan 21 17:10:45 np0005590981.novalocal sshd[1003]: Server listening on :: port 22.
Jan 21 17:10:45 np0005590981.novalocal systemd[1]: Started OpenSSH server daemon.
Jan 21 17:10:45 np0005590981.novalocal systemd[1]: Finished Permit User Sessions.
Jan 21 17:10:45 np0005590981.novalocal rsyslogd[1002]: [origin software="rsyslogd" swVersion="8.2510.0-2.el9" x-pid="1002" x-info="https://www.rsyslog.com"] start
Jan 21 17:10:45 np0005590981.novalocal rsyslogd[1002]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Jan 21 17:10:45 np0005590981.novalocal systemd[1]: Started System Logging Service.
Jan 21 17:10:45 np0005590981.novalocal systemd[1]: Started Command Scheduler.
Jan 21 17:10:45 np0005590981.novalocal systemd[1]: Started Getty on tty1.
Jan 21 17:10:45 np0005590981.novalocal crond[1009]: (CRON) STARTUP (1.5.7)
Jan 21 17:10:45 np0005590981.novalocal crond[1009]: (CRON) INFO (Syslog will be used instead of sendmail.)
Jan 21 17:10:45 np0005590981.novalocal crond[1009]: (CRON) INFO (RANDOM_DELAY will be scaled with factor 73% if used.)
Jan 21 17:10:45 np0005590981.novalocal crond[1009]: (CRON) INFO (running with inotify support)
Jan 21 17:10:45 np0005590981.novalocal systemd[1]: Started Serial Getty on ttyS0.
Jan 21 17:10:45 np0005590981.novalocal systemd[1]: Reached target Login Prompts.
Jan 21 17:10:45 np0005590981.novalocal systemd[1]: Reached target Multi-User System.
Jan 21 17:10:45 np0005590981.novalocal systemd[1]: Starting Record Runlevel Change in UTMP...
Jan 21 17:10:45 np0005590981.novalocal systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Jan 21 17:10:45 np0005590981.novalocal systemd[1]: Finished Record Runlevel Change in UTMP.
Jan 21 17:10:45 np0005590981.novalocal rsyslogd[1002]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 21 17:10:45 np0005590981.novalocal kdumpctl[1015]: kdump: No kdump initial ramdisk found.
Jan 21 17:10:45 np0005590981.novalocal kdumpctl[1015]: kdump: Rebuilding /boot/initramfs-5.14.0-661.el9.x86_64kdump.img
Jan 21 17:10:45 np0005590981.novalocal cloud-init[1092]: Cloud-init v. 24.4-8.el9 running 'modules:config' at Wed, 21 Jan 2026 17:10:45 +0000. Up 9.33 seconds.
Jan 21 17:10:45 np0005590981.novalocal sshd-session[1105]: Connection reset by 38.102.83.114 port 43414 [preauth]
Jan 21 17:10:45 np0005590981.novalocal sshd-session[1122]: Unable to negotiate with 38.102.83.114 port 43424: no matching host key type found. Their offer: ssh-ed25519,ssh-ed25519-cert-v01@openssh.com [preauth]
Jan 21 17:10:45 np0005590981.novalocal systemd[1]: Finished Cloud-init: Config Stage.
Jan 21 17:10:45 np0005590981.novalocal sshd-session[1136]: Unable to negotiate with 38.102.83.114 port 43444: no matching host key type found. Their offer: ecdsa-sha2-nistp384,ecdsa-sha2-nistp384-cert-v01@openssh.com [preauth]
Jan 21 17:10:45 np0005590981.novalocal sshd-session[1144]: Unable to negotiate with 38.102.83.114 port 43446: no matching host key type found. Their offer: ecdsa-sha2-nistp521,ecdsa-sha2-nistp521-cert-v01@openssh.com [preauth]
Jan 21 17:10:45 np0005590981.novalocal systemd[1]: Starting Cloud-init: Final Stage...
Jan 21 17:10:45 np0005590981.novalocal sshd-session[1162]: Connection reset by 38.102.83.114 port 43456 [preauth]
Jan 21 17:10:45 np0005590981.novalocal sshd-session[1172]: Unable to negotiate with 38.102.83.114 port 43472: no matching host key type found. Their offer: ssh-rsa,ssh-rsa-cert-v01@openssh.com [preauth]
Jan 21 17:10:45 np0005590981.novalocal sshd-session[1131]: Connection closed by 38.102.83.114 port 43434 [preauth]
Jan 21 17:10:45 np0005590981.novalocal sshd-session[1179]: Unable to negotiate with 38.102.83.114 port 43474: no matching host key type found. Their offer: ssh-dss,ssh-dss-cert-v01@openssh.com [preauth]
Jan 21 17:10:45 np0005590981.novalocal sshd-session[1150]: Connection closed by 38.102.83.114 port 43450 [preauth]
Jan 21 17:10:46 np0005590981.novalocal dracut[1282]: dracut-057-102.git20250818.el9
Jan 21 17:10:46 np0005590981.novalocal cloud-init[1298]: Cloud-init v. 24.4-8.el9 running 'modules:final' at Wed, 21 Jan 2026 17:10:46 +0000. Up 9.73 seconds.
Jan 21 17:10:46 np0005590981.novalocal cloud-init[1300]: #############################################################
Jan 21 17:10:46 np0005590981.novalocal cloud-init[1301]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Jan 21 17:10:46 np0005590981.novalocal cloud-init[1303]: 256 SHA256:F1b7jbva7qMH/w9Vo6eee27uwSBZLbM43A4cSViQZIA root@np0005590981.novalocal (ECDSA)
Jan 21 17:10:46 np0005590981.novalocal cloud-init[1305]: 256 SHA256:Ka7g3IIskYXPKJOPopmLqQZ/uQtB0kCa7YK+bqbjSm0 root@np0005590981.novalocal (ED25519)
Jan 21 17:10:46 np0005590981.novalocal cloud-init[1307]: 3072 SHA256:JnuBkp7Tv+KD4m0tJsqiwdD7ogg1lnB9J71LAAREYFM root@np0005590981.novalocal (RSA)
Jan 21 17:10:46 np0005590981.novalocal cloud-init[1308]: -----END SSH HOST KEY FINGERPRINTS-----
Jan 21 17:10:46 np0005590981.novalocal cloud-init[1310]: #############################################################
Jan 21 17:10:46 np0005590981.novalocal dracut[1284]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40 /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-661.el9.x86_64kdump.img 5.14.0-661.el9.x86_64
Jan 21 17:10:46 np0005590981.novalocal cloud-init[1298]: Cloud-init v. 24.4-8.el9 finished at Wed, 21 Jan 2026 17:10:46 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 9.93 seconds
Jan 21 17:10:46 np0005590981.novalocal systemd[1]: Finished Cloud-init: Final Stage.
Jan 21 17:10:46 np0005590981.novalocal systemd[1]: Reached target Cloud-init target.
Jan 21 17:10:46 np0005590981.novalocal dracut[1284]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Jan 21 17:10:46 np0005590981.novalocal dracut[1284]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Jan 21 17:10:46 np0005590981.novalocal dracut[1284]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Jan 21 17:10:46 np0005590981.novalocal dracut[1284]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Jan 21 17:10:46 np0005590981.novalocal dracut[1284]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Jan 21 17:10:46 np0005590981.novalocal dracut[1284]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Jan 21 17:10:46 np0005590981.novalocal dracut[1284]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Jan 21 17:10:46 np0005590981.novalocal dracut[1284]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Jan 21 17:10:46 np0005590981.novalocal dracut[1284]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Jan 21 17:10:46 np0005590981.novalocal dracut[1284]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Jan 21 17:10:46 np0005590981.novalocal dracut[1284]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Jan 21 17:10:46 np0005590981.novalocal dracut[1284]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Jan 21 17:10:46 np0005590981.novalocal dracut[1284]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Jan 21 17:10:46 np0005590981.novalocal dracut[1284]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Jan 21 17:10:46 np0005590981.novalocal dracut[1284]: Module 'ifcfg' will not be installed, because it's in the list to be omitted!
Jan 21 17:10:46 np0005590981.novalocal dracut[1284]: Module 'plymouth' will not be installed, because it's in the list to be omitted!
Jan 21 17:10:46 np0005590981.novalocal dracut[1284]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Jan 21 17:10:46 np0005590981.novalocal dracut[1284]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Jan 21 17:10:46 np0005590981.novalocal dracut[1284]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Jan 21 17:10:46 np0005590981.novalocal dracut[1284]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Jan 21 17:10:46 np0005590981.novalocal dracut[1284]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Jan 21 17:10:46 np0005590981.novalocal dracut[1284]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Jan 21 17:10:46 np0005590981.novalocal dracut[1284]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Jan 21 17:10:47 np0005590981.novalocal dracut[1284]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Jan 21 17:10:47 np0005590981.novalocal dracut[1284]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Jan 21 17:10:47 np0005590981.novalocal dracut[1284]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Jan 21 17:10:47 np0005590981.novalocal dracut[1284]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Jan 21 17:10:47 np0005590981.novalocal dracut[1284]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Jan 21 17:10:47 np0005590981.novalocal dracut[1284]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Jan 21 17:10:47 np0005590981.novalocal dracut[1284]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Jan 21 17:10:47 np0005590981.novalocal dracut[1284]: Module 'resume' will not be installed, because it's in the list to be omitted!
Jan 21 17:10:47 np0005590981.novalocal dracut[1284]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Jan 21 17:10:47 np0005590981.novalocal dracut[1284]: Module 'earlykdump' will not be installed, because it's in the list to be omitted!
Jan 21 17:10:47 np0005590981.novalocal dracut[1284]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Jan 21 17:10:47 np0005590981.novalocal dracut[1284]: memstrack is not available
Jan 21 17:10:47 np0005590981.novalocal dracut[1284]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Jan 21 17:10:47 np0005590981.novalocal dracut[1284]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Jan 21 17:10:47 np0005590981.novalocal dracut[1284]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Jan 21 17:10:47 np0005590981.novalocal dracut[1284]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Jan 21 17:10:47 np0005590981.novalocal dracut[1284]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Jan 21 17:10:47 np0005590981.novalocal dracut[1284]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Jan 21 17:10:47 np0005590981.novalocal dracut[1284]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Jan 21 17:10:47 np0005590981.novalocal dracut[1284]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Jan 21 17:10:47 np0005590981.novalocal dracut[1284]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Jan 21 17:10:47 np0005590981.novalocal dracut[1284]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Jan 21 17:10:47 np0005590981.novalocal dracut[1284]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Jan 21 17:10:47 np0005590981.novalocal dracut[1284]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Jan 21 17:10:47 np0005590981.novalocal dracut[1284]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Jan 21 17:10:47 np0005590981.novalocal dracut[1284]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Jan 21 17:10:47 np0005590981.novalocal dracut[1284]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Jan 21 17:10:47 np0005590981.novalocal dracut[1284]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Jan 21 17:10:47 np0005590981.novalocal dracut[1284]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Jan 21 17:10:47 np0005590981.novalocal dracut[1284]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Jan 21 17:10:47 np0005590981.novalocal dracut[1284]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Jan 21 17:10:47 np0005590981.novalocal dracut[1284]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Jan 21 17:10:47 np0005590981.novalocal dracut[1284]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Jan 21 17:10:47 np0005590981.novalocal dracut[1284]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Jan 21 17:10:47 np0005590981.novalocal dracut[1284]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Jan 21 17:10:47 np0005590981.novalocal dracut[1284]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Jan 21 17:10:47 np0005590981.novalocal dracut[1284]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Jan 21 17:10:47 np0005590981.novalocal dracut[1284]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Jan 21 17:10:47 np0005590981.novalocal dracut[1284]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Jan 21 17:10:47 np0005590981.novalocal dracut[1284]: memstrack is not available
Jan 21 17:10:47 np0005590981.novalocal dracut[1284]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Jan 21 17:10:47 np0005590981.novalocal dracut[1284]: *** Including module: systemd ***
Jan 21 17:10:48 np0005590981.novalocal dracut[1284]: *** Including module: fips ***
Jan 21 17:10:48 np0005590981.novalocal dracut[1284]: *** Including module: systemd-initrd ***
Jan 21 17:10:48 np0005590981.novalocal chronyd[794]: Selected source 162.159.200.1 (2.centos.pool.ntp.org)
Jan 21 17:10:48 np0005590981.novalocal chronyd[794]: System clock TAI offset set to 37 seconds
Jan 21 17:10:48 np0005590981.novalocal dracut[1284]: *** Including module: i18n ***
Jan 21 17:10:48 np0005590981.novalocal dracut[1284]: *** Including module: drm ***
Jan 21 17:10:48 np0005590981.novalocal dracut[1284]: *** Including module: prefixdevname ***
Jan 21 17:10:48 np0005590981.novalocal dracut[1284]: *** Including module: kernel-modules ***
Jan 21 17:10:49 np0005590981.novalocal kernel: block vda: the capability attribute has been deprecated.
Jan 21 17:10:49 np0005590981.novalocal dracut[1284]: *** Including module: kernel-modules-extra ***
Jan 21 17:10:49 np0005590981.novalocal dracut[1284]:   kernel-modules-extra: configuration source "/run/depmod.d" does not exist
Jan 21 17:10:49 np0005590981.novalocal dracut[1284]:   kernel-modules-extra: configuration source "/lib/depmod.d" does not exist
Jan 21 17:10:49 np0005590981.novalocal dracut[1284]:   kernel-modules-extra: parsing configuration file "/etc/depmod.d/dist.conf"
Jan 21 17:10:49 np0005590981.novalocal dracut[1284]:   kernel-modules-extra: /etc/depmod.d/dist.conf: added "updates extra built-in weak-updates" to the list of search directories
Jan 21 17:10:49 np0005590981.novalocal dracut[1284]: *** Including module: qemu ***
Jan 21 17:10:49 np0005590981.novalocal dracut[1284]: *** Including module: fstab-sys ***
Jan 21 17:10:49 np0005590981.novalocal dracut[1284]: *** Including module: rootfs-block ***
Jan 21 17:10:49 np0005590981.novalocal dracut[1284]: *** Including module: terminfo ***
Jan 21 17:10:49 np0005590981.novalocal dracut[1284]: *** Including module: udev-rules ***
Jan 21 17:10:50 np0005590981.novalocal dracut[1284]: Skipping udev rule: 91-permissions.rules
Jan 21 17:10:50 np0005590981.novalocal dracut[1284]: Skipping udev rule: 80-drivers-modprobe.rules
Jan 21 17:10:50 np0005590981.novalocal dracut[1284]: *** Including module: virtiofs ***
Jan 21 17:10:50 np0005590981.novalocal dracut[1284]: *** Including module: dracut-systemd ***
Jan 21 17:10:50 np0005590981.novalocal dracut[1284]: *** Including module: usrmount ***
Jan 21 17:10:50 np0005590981.novalocal dracut[1284]: *** Including module: base ***
Jan 21 17:10:50 np0005590981.novalocal dracut[1284]: *** Including module: fs-lib ***
Jan 21 17:10:50 np0005590981.novalocal dracut[1284]: *** Including module: kdumpbase ***
Jan 21 17:10:51 np0005590981.novalocal dracut[1284]: *** Including module: microcode_ctl-fw_dir_override ***
Jan 21 17:10:51 np0005590981.novalocal dracut[1284]:   microcode_ctl module: mangling fw_dir
Jan 21 17:10:51 np0005590981.novalocal dracut[1284]:     microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Jan 21 17:10:51 np0005590981.novalocal dracut[1284]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Jan 21 17:10:51 np0005590981.novalocal dracut[1284]:     microcode_ctl: configuration "intel" is ignored
Jan 21 17:10:51 np0005590981.novalocal dracut[1284]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Jan 21 17:10:51 np0005590981.novalocal dracut[1284]:     microcode_ctl: configuration "intel-06-2d-07" is ignored
Jan 21 17:10:51 np0005590981.novalocal dracut[1284]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Jan 21 17:10:51 np0005590981.novalocal dracut[1284]:     microcode_ctl: configuration "intel-06-4e-03" is ignored
Jan 21 17:10:51 np0005590981.novalocal dracut[1284]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Jan 21 17:10:51 np0005590981.novalocal dracut[1284]:     microcode_ctl: configuration "intel-06-4f-01" is ignored
Jan 21 17:10:51 np0005590981.novalocal dracut[1284]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Jan 21 17:10:51 np0005590981.novalocal dracut[1284]:     microcode_ctl: configuration "intel-06-55-04" is ignored
Jan 21 17:10:51 np0005590981.novalocal dracut[1284]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Jan 21 17:10:51 np0005590981.novalocal dracut[1284]:     microcode_ctl: configuration "intel-06-5e-03" is ignored
Jan 21 17:10:51 np0005590981.novalocal dracut[1284]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Jan 21 17:10:51 np0005590981.novalocal dracut[1284]:     microcode_ctl: configuration "intel-06-8c-01" is ignored
Jan 21 17:10:51 np0005590981.novalocal dracut[1284]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Jan 21 17:10:51 np0005590981.novalocal dracut[1284]:     microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Jan 21 17:10:51 np0005590981.novalocal dracut[1284]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Jan 21 17:10:51 np0005590981.novalocal dracut[1284]:     microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Jan 21 17:10:51 np0005590981.novalocal dracut[1284]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Jan 21 17:10:51 np0005590981.novalocal dracut[1284]:     microcode_ctl: configuration "intel-06-8f-08" is ignored
Jan 21 17:10:51 np0005590981.novalocal dracut[1284]:     microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Jan 21 17:10:51 np0005590981.novalocal dracut[1284]: *** Including module: openssl ***
Jan 21 17:10:51 np0005590981.novalocal dracut[1284]: *** Including module: shutdown ***
Jan 21 17:10:51 np0005590981.novalocal dracut[1284]: *** Including module: squash ***
Jan 21 17:10:51 np0005590981.novalocal dracut[1284]: *** Including modules done ***
Jan 21 17:10:51 np0005590981.novalocal dracut[1284]: *** Installing kernel module dependencies ***
Jan 21 17:10:52 np0005590981.novalocal irqbalance[777]: Cannot change IRQ 35 affinity: Operation not permitted
Jan 21 17:10:52 np0005590981.novalocal irqbalance[777]: IRQ 35 affinity is now unmanaged
Jan 21 17:10:52 np0005590981.novalocal irqbalance[777]: Cannot change IRQ 33 affinity: Operation not permitted
Jan 21 17:10:52 np0005590981.novalocal irqbalance[777]: IRQ 33 affinity is now unmanaged
Jan 21 17:10:52 np0005590981.novalocal irqbalance[777]: Cannot change IRQ 31 affinity: Operation not permitted
Jan 21 17:10:52 np0005590981.novalocal irqbalance[777]: IRQ 31 affinity is now unmanaged
Jan 21 17:10:52 np0005590981.novalocal irqbalance[777]: Cannot change IRQ 28 affinity: Operation not permitted
Jan 21 17:10:52 np0005590981.novalocal irqbalance[777]: IRQ 28 affinity is now unmanaged
Jan 21 17:10:52 np0005590981.novalocal irqbalance[777]: Cannot change IRQ 34 affinity: Operation not permitted
Jan 21 17:10:52 np0005590981.novalocal irqbalance[777]: IRQ 34 affinity is now unmanaged
Jan 21 17:10:52 np0005590981.novalocal irqbalance[777]: Cannot change IRQ 32 affinity: Operation not permitted
Jan 21 17:10:52 np0005590981.novalocal irqbalance[777]: IRQ 32 affinity is now unmanaged
Jan 21 17:10:52 np0005590981.novalocal irqbalance[777]: Cannot change IRQ 30 affinity: Operation not permitted
Jan 21 17:10:52 np0005590981.novalocal irqbalance[777]: IRQ 30 affinity is now unmanaged
Jan 21 17:10:52 np0005590981.novalocal irqbalance[777]: Cannot change IRQ 29 affinity: Operation not permitted
Jan 21 17:10:52 np0005590981.novalocal irqbalance[777]: IRQ 29 affinity is now unmanaged
Jan 21 17:10:52 np0005590981.novalocal dracut[1284]: *** Installing kernel module dependencies done ***
Jan 21 17:10:52 np0005590981.novalocal dracut[1284]: *** Resolving executable dependencies ***
Jan 21 17:10:53 np0005590981.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 21 17:10:53 np0005590981.novalocal dracut[1284]: *** Resolving executable dependencies done ***
Jan 21 17:10:53 np0005590981.novalocal dracut[1284]: *** Generating early-microcode cpio image ***
Jan 21 17:10:53 np0005590981.novalocal dracut[1284]: *** Store current command line parameters ***
Jan 21 17:10:53 np0005590981.novalocal dracut[1284]: Stored kernel commandline:
Jan 21 17:10:53 np0005590981.novalocal dracut[1284]: No dracut internal kernel commandline stored in the initramfs
Jan 21 17:10:54 np0005590981.novalocal dracut[1284]: *** Install squash loader ***
Jan 21 17:10:55 np0005590981.novalocal dracut[1284]: *** Squashing the files inside the initramfs ***
Jan 21 17:10:56 np0005590981.novalocal dracut[1284]: *** Squashing the files inside the initramfs done ***
Jan 21 17:10:56 np0005590981.novalocal dracut[1284]: *** Creating image file '/boot/initramfs-5.14.0-661.el9.x86_64kdump.img' ***
Jan 21 17:10:56 np0005590981.novalocal dracut[1284]: *** Hardlinking files ***
Jan 21 17:10:56 np0005590981.novalocal dracut[1284]: Mode:           real
Jan 21 17:10:56 np0005590981.novalocal dracut[1284]: Files:          50
Jan 21 17:10:56 np0005590981.novalocal dracut[1284]: Linked:         0 files
Jan 21 17:10:56 np0005590981.novalocal dracut[1284]: Compared:       0 xattrs
Jan 21 17:10:56 np0005590981.novalocal dracut[1284]: Compared:       0 files
Jan 21 17:10:56 np0005590981.novalocal dracut[1284]: Saved:          0 B
Jan 21 17:10:56 np0005590981.novalocal dracut[1284]: Duration:       0.000530 seconds
Jan 21 17:10:56 np0005590981.novalocal dracut[1284]: *** Hardlinking files done ***
Jan 21 17:10:56 np0005590981.novalocal dracut[1284]: *** Creating initramfs image file '/boot/initramfs-5.14.0-661.el9.x86_64kdump.img' done ***
Jan 21 17:10:57 np0005590981.novalocal kdumpctl[1015]: kdump: kexec: loaded kdump kernel
Jan 21 17:10:57 np0005590981.novalocal kdumpctl[1015]: kdump: Starting kdump: [OK]
Jan 21 17:10:57 np0005590981.novalocal systemd[1]: Finished Crash recovery kernel arming.
Jan 21 17:10:57 np0005590981.novalocal systemd[1]: Startup finished in 1.607s (kernel) + 2.770s (initrd) + 16.302s (userspace) = 20.680s.
Jan 21 17:11:13 np0005590981.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 21 17:11:21 np0005590981.novalocal sshd-session[4301]: Invalid user fluxos from 64.227.98.100 port 33970
Jan 21 17:11:21 np0005590981.novalocal sshd-session[4301]: Connection closed by invalid user fluxos 64.227.98.100 port 33970 [preauth]
Jan 21 17:12:03 np0005590981.novalocal sshd-session[4304]: error: kex_exchange_identification: read: Connection reset by peer
Jan 21 17:12:03 np0005590981.novalocal sshd-session[4304]: Connection reset by 176.120.22.52 port 55262
Jan 21 17:12:20 np0005590981.novalocal sshd-session[4305]: Accepted publickey for zuul from 38.102.83.114 port 59028 ssh2: RSA SHA256:zhs3MiW0JhxzckYcMHQES8SMYHj1iGcomnyzmbiwor8
Jan 21 17:12:20 np0005590981.novalocal systemd[1]: Created slice User Slice of UID 1000.
Jan 21 17:12:20 np0005590981.novalocal systemd[1]: Starting User Runtime Directory /run/user/1000...
Jan 21 17:12:20 np0005590981.novalocal systemd-logind[782]: New session 1 of user zuul.
Jan 21 17:12:20 np0005590981.novalocal systemd[1]: Finished User Runtime Directory /run/user/1000.
Jan 21 17:12:20 np0005590981.novalocal systemd[1]: Starting User Manager for UID 1000...
Jan 21 17:12:20 np0005590981.novalocal systemd[4309]: pam_unix(systemd-user:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 21 17:12:20 np0005590981.novalocal systemd[4309]: Queued start job for default target Main User Target.
Jan 21 17:12:20 np0005590981.novalocal systemd[4309]: Created slice User Application Slice.
Jan 21 17:12:20 np0005590981.novalocal systemd[4309]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 21 17:12:20 np0005590981.novalocal systemd[4309]: Started Daily Cleanup of User's Temporary Directories.
Jan 21 17:12:20 np0005590981.novalocal systemd[4309]: Reached target Paths.
Jan 21 17:12:20 np0005590981.novalocal systemd[4309]: Reached target Timers.
Jan 21 17:12:20 np0005590981.novalocal systemd[4309]: Starting D-Bus User Message Bus Socket...
Jan 21 17:12:20 np0005590981.novalocal systemd[4309]: Starting Create User's Volatile Files and Directories...
Jan 21 17:12:20 np0005590981.novalocal systemd[4309]: Listening on D-Bus User Message Bus Socket.
Jan 21 17:12:20 np0005590981.novalocal systemd[4309]: Reached target Sockets.
Jan 21 17:12:20 np0005590981.novalocal systemd[4309]: Finished Create User's Volatile Files and Directories.
Jan 21 17:12:20 np0005590981.novalocal systemd[4309]: Reached target Basic System.
Jan 21 17:12:20 np0005590981.novalocal systemd[4309]: Reached target Main User Target.
Jan 21 17:12:20 np0005590981.novalocal systemd[4309]: Startup finished in 118ms.
Jan 21 17:12:20 np0005590981.novalocal systemd[1]: Started User Manager for UID 1000.
Jan 21 17:12:20 np0005590981.novalocal systemd[1]: Started Session 1 of User zuul.
Jan 21 17:12:20 np0005590981.novalocal sshd-session[4305]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 21 17:12:21 np0005590981.novalocal python3[4393]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 17:12:23 np0005590981.novalocal python3[4421]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 17:12:31 np0005590981.novalocal python3[4479]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 17:12:32 np0005590981.novalocal python3[4519]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Jan 21 17:12:35 np0005590981.novalocal python3[4545]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCeuhb3h2SueVuN+nmMhuOwkUIUr/Z95RZpCGQcWQAg3bpfrlsODsqqTEL85dNwX7EAA8Ur/d2ZQ5fDWpuBHpEa4LdX0YYM3FwMK8VHqO3krm3iHbFb0hbP9UeRGfVh/kc4DopCn3dADOfM6r0N6wTENvHdBZR25ypHO22CbvhfC2Y7wXIuw8FEIEfFc/akbI5N4WZw9WJo2vuV2FQypMHacljm/2cG+Gt/eA6XgtWzd8rMxEWZXGStt2erlKQP1+NKxJ5BvsbdyftMKGfuQutTFWKoo9FgwgX2IlLwaT1ey2aof7g1WWd9B7mod9xlJUXCbwpo/IbV+9MtvapA1Ynf53s1uUbRvry9XXYgbbnkiBLUNS4VphzcSDCCCxvxWGHy1JO1rLx03H9FU6JyAQDr459ITFdPiK2wqLif/c1VCUkH1JzLGuHctO0nghMLyEJLiqpOMWE6NvXZ04iMWdgwYVl7DdCNRn/4zcn8A6kSoebQPhcYcyiyGk3qOe2xGWk= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 17:12:35 np0005590981.novalocal python3[4569]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:12:35 np0005590981.novalocal python3[4668]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 21 17:12:36 np0005590981.novalocal python3[4739]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769015555.6084878-229-43238437053848/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=c2618586e2344b0db12a08e6a825ad97_id_rsa follow=False checksum=987948de0f7fcd2f3282da40f86f70b05a32dd5d backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:12:37 np0005590981.novalocal python3[4862]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 21 17:12:37 np0005590981.novalocal python3[4933]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769015557.1628864-273-68387701723689/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=c2618586e2344b0db12a08e6a825ad97_id_rsa.pub follow=False checksum=783f13dde8098aabe44c2a58251731d1bbf59533 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:12:39 np0005590981.novalocal python3[4981]: ansible-ping Invoked with data=pong
Jan 21 17:12:40 np0005590981.novalocal python3[5005]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 17:12:42 np0005590981.novalocal python3[5063]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Jan 21 17:12:43 np0005590981.novalocal python3[5095]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:12:44 np0005590981.novalocal python3[5119]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:12:44 np0005590981.novalocal python3[5143]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:12:44 np0005590981.novalocal python3[5167]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:12:44 np0005590981.novalocal python3[5191]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:12:45 np0005590981.novalocal python3[5215]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:12:46 np0005590981.novalocal sudo[5239]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ewtrgknsbqapcejweqgownbzeyhgybwy ; /usr/bin/python3'
Jan 21 17:12:46 np0005590981.novalocal sudo[5239]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:12:46 np0005590981.novalocal python3[5241]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:12:47 np0005590981.novalocal sudo[5239]: pam_unix(sudo:session): session closed for user root
Jan 21 17:12:47 np0005590981.novalocal sudo[5317]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dsvvkbtjpfyceounpjsaictjpcejfayh ; /usr/bin/python3'
Jan 21 17:12:47 np0005590981.novalocal sudo[5317]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:12:47 np0005590981.novalocal python3[5319]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 21 17:12:47 np0005590981.novalocal sudo[5317]: pam_unix(sudo:session): session closed for user root
Jan 21 17:12:48 np0005590981.novalocal sudo[5390]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mkzgaszjlkhlpogrqleqxhtrigyxuyil ; /usr/bin/python3'
Jan 21 17:12:48 np0005590981.novalocal sudo[5390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:12:48 np0005590981.novalocal python3[5392]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1769015567.1490777-26-132891974727592/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:12:48 np0005590981.novalocal sudo[5390]: pam_unix(sudo:session): session closed for user root
Jan 21 17:12:48 np0005590981.novalocal python3[5440]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 17:12:49 np0005590981.novalocal python3[5464]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 17:12:49 np0005590981.novalocal python3[5488]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 17:12:49 np0005590981.novalocal python3[5512]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 17:12:49 np0005590981.novalocal python3[5536]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 17:12:50 np0005590981.novalocal python3[5560]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 17:12:50 np0005590981.novalocal python3[5584]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 17:12:50 np0005590981.novalocal python3[5608]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 17:12:51 np0005590981.novalocal python3[5632]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 17:12:51 np0005590981.novalocal python3[5656]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 17:12:51 np0005590981.novalocal python3[5680]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 17:12:51 np0005590981.novalocal python3[5704]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 17:12:52 np0005590981.novalocal python3[5728]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 17:12:52 np0005590981.novalocal python3[5752]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 17:12:52 np0005590981.novalocal python3[5776]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 17:12:53 np0005590981.novalocal python3[5800]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 17:12:53 np0005590981.novalocal python3[5824]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 17:12:53 np0005590981.novalocal python3[5848]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 17:12:53 np0005590981.novalocal python3[5872]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 17:12:54 np0005590981.novalocal python3[5896]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 17:12:54 np0005590981.novalocal python3[5920]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 17:12:54 np0005590981.novalocal python3[5944]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 17:12:55 np0005590981.novalocal python3[5968]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 17:12:55 np0005590981.novalocal python3[5992]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 17:12:55 np0005590981.novalocal python3[6016]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 17:12:56 np0005590981.novalocal python3[6040]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 17:12:58 np0005590981.novalocal sudo[6064]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mxtcuthrezusekjzoswjjsymwsxsialk ; /usr/bin/python3'
Jan 21 17:12:58 np0005590981.novalocal sudo[6064]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:12:58 np0005590981.novalocal python3[6066]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Jan 21 17:12:58 np0005590981.novalocal systemd[1]: Starting Time & Date Service...
Jan 21 17:12:58 np0005590981.novalocal systemd[1]: Started Time & Date Service.
Jan 21 17:12:58 np0005590981.novalocal systemd-timedated[6068]: Changed time zone to 'UTC' (UTC).
Jan 21 17:12:58 np0005590981.novalocal sudo[6064]: pam_unix(sudo:session): session closed for user root
Jan 21 17:12:58 np0005590981.novalocal sudo[6095]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wmrpubtubmzsybffhdcaarqnjuvfhufu ; /usr/bin/python3'
Jan 21 17:12:58 np0005590981.novalocal sudo[6095]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:12:58 np0005590981.novalocal python3[6097]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:12:58 np0005590981.novalocal sudo[6095]: pam_unix(sudo:session): session closed for user root
Jan 21 17:12:59 np0005590981.novalocal python3[6173]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 21 17:12:59 np0005590981.novalocal python3[6244]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1769015579.1456256-202-198380076002082/source _original_basename=tmpxk6by2wn follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:13:00 np0005590981.novalocal python3[6344]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 21 17:13:00 np0005590981.novalocal python3[6415]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1769015580.1475585-242-105474634157394/source _original_basename=tmpnhv8lzi6 follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:13:01 np0005590981.novalocal sudo[6515]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ckbtxuwjgiwzihixpvpojdqslhjyufos ; /usr/bin/python3'
Jan 21 17:13:01 np0005590981.novalocal sudo[6515]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:13:01 np0005590981.novalocal python3[6517]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 21 17:13:01 np0005590981.novalocal sudo[6515]: pam_unix(sudo:session): session closed for user root
Jan 21 17:13:01 np0005590981.novalocal sudo[6588]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kxeklgdcgxncrzfvfhrnvbpjvforrvao ; /usr/bin/python3'
Jan 21 17:13:01 np0005590981.novalocal sudo[6588]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:13:01 np0005590981.novalocal python3[6590]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1769015581.2837183-306-240816676242142/source _original_basename=tmp9iwenlqj follow=False checksum=9dc2039529c0f35ddba9b5f747501467f5135778 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:13:01 np0005590981.novalocal sudo[6588]: pam_unix(sudo:session): session closed for user root
Jan 21 17:13:02 np0005590981.novalocal python3[6638]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 17:13:02 np0005590981.novalocal python3[6664]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 17:13:03 np0005590981.novalocal sudo[6742]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sgsyaznblsfewmponlzsnetwuezmndxc ; /usr/bin/python3'
Jan 21 17:13:03 np0005590981.novalocal sudo[6742]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:13:03 np0005590981.novalocal python3[6744]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 21 17:13:03 np0005590981.novalocal sudo[6742]: pam_unix(sudo:session): session closed for user root
Jan 21 17:13:03 np0005590981.novalocal sudo[6815]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ragvatdydxmribahesngdpoguahfnuyl ; /usr/bin/python3'
Jan 21 17:13:03 np0005590981.novalocal sudo[6815]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:13:03 np0005590981.novalocal python3[6817]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1769015582.8919058-362-68460268381911/source _original_basename=tmp287eyvg9 follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:13:03 np0005590981.novalocal sudo[6815]: pam_unix(sudo:session): session closed for user root
Jan 21 17:13:03 np0005590981.novalocal sudo[6866]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ccuzsxtflgnxrqzimafbrwwzqkywfphe ; /usr/bin/python3'
Jan 21 17:13:03 np0005590981.novalocal sudo[6866]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:13:04 np0005590981.novalocal python3[6868]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163e3b-3c83-d753-8bc2-00000000001e-1-compute0 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 17:13:04 np0005590981.novalocal sudo[6866]: pam_unix(sudo:session): session closed for user root
Jan 21 17:13:04 np0005590981.novalocal python3[6896]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env
                                                       _uses_shell=True zuul_log_id=fa163e3b-3c83-d753-8bc2-00000000001f-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Jan 21 17:13:06 np0005590981.novalocal python3[6924]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:13:25 np0005590981.novalocal sudo[6948]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xgzcmjqpncecwpybqwvldeipieajucot ; /usr/bin/python3'
Jan 21 17:13:25 np0005590981.novalocal sudo[6948]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:13:25 np0005590981.novalocal python3[6950]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:13:25 np0005590981.novalocal sudo[6948]: pam_unix(sudo:session): session closed for user root
Jan 21 17:13:28 np0005590981.novalocal systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 21 17:13:36 np0005590981.novalocal sshd-session[6953]: Connection closed by authenticating user root 64.227.98.100 port 55704 [preauth]
Jan 21 17:14:06 np0005590981.novalocal kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Jan 21 17:14:06 np0005590981.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Jan 21 17:14:06 np0005590981.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Jan 21 17:14:06 np0005590981.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Jan 21 17:14:06 np0005590981.novalocal kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Jan 21 17:14:06 np0005590981.novalocal kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Jan 21 17:14:06 np0005590981.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Jan 21 17:14:06 np0005590981.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Jan 21 17:14:06 np0005590981.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Jan 21 17:14:06 np0005590981.novalocal kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Jan 21 17:14:06 np0005590981.novalocal NetworkManager[855]: <info>  [1769015646.1385] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 21 17:14:06 np0005590981.novalocal systemd-udevd[6955]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 17:14:06 np0005590981.novalocal NetworkManager[855]: <info>  [1769015646.1553] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 21 17:14:06 np0005590981.novalocal NetworkManager[855]: <info>  [1769015646.1576] settings: (eth1): created default wired connection 'Wired connection 1'
Jan 21 17:14:06 np0005590981.novalocal NetworkManager[855]: <info>  [1769015646.1578] device (eth1): carrier: link connected
Jan 21 17:14:06 np0005590981.novalocal NetworkManager[855]: <info>  [1769015646.1580] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Jan 21 17:14:06 np0005590981.novalocal NetworkManager[855]: <info>  [1769015646.1585] policy: auto-activating connection 'Wired connection 1' (3cb94cdd-a1f2-3cfa-b2b4-788809cba9e0)
Jan 21 17:14:06 np0005590981.novalocal NetworkManager[855]: <info>  [1769015646.1588] device (eth1): Activation: starting connection 'Wired connection 1' (3cb94cdd-a1f2-3cfa-b2b4-788809cba9e0)
Jan 21 17:14:06 np0005590981.novalocal NetworkManager[855]: <info>  [1769015646.1589] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 21 17:14:06 np0005590981.novalocal NetworkManager[855]: <info>  [1769015646.1591] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 21 17:14:06 np0005590981.novalocal NetworkManager[855]: <info>  [1769015646.1594] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 21 17:14:06 np0005590981.novalocal NetworkManager[855]: <info>  [1769015646.1598] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 21 17:14:07 np0005590981.novalocal python3[6982]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163e3b-3c83-9cf0-e2ec-000000000112-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 17:14:13 np0005590981.novalocal sudo[7060]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zpqytvjfyptemrsfmpjijsjjimmasifl ; OS_CLOUD=vexxhost /usr/bin/python3'
Jan 21 17:14:13 np0005590981.novalocal sudo[7060]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:14:14 np0005590981.novalocal python3[7062]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 21 17:14:14 np0005590981.novalocal sudo[7060]: pam_unix(sudo:session): session closed for user root
Jan 21 17:14:14 np0005590981.novalocal sudo[7133]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pmubiliteckerjtkrdsbxtpbmyeqgvpd ; OS_CLOUD=vexxhost /usr/bin/python3'
Jan 21 17:14:14 np0005590981.novalocal sudo[7133]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:14:14 np0005590981.novalocal python3[7135]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769015653.872597-103-113729540467400/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=8d94bb7fbd7d07b070fba5f7aa501b205857704a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:14:14 np0005590981.novalocal sudo[7133]: pam_unix(sudo:session): session closed for user root
Jan 21 17:14:15 np0005590981.novalocal sudo[7183]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pokywdbhijvuokbegbegsijjmkztktck ; OS_CLOUD=vexxhost /usr/bin/python3'
Jan 21 17:14:15 np0005590981.novalocal sudo[7183]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:14:15 np0005590981.novalocal python3[7185]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 21 17:14:15 np0005590981.novalocal systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Jan 21 17:14:15 np0005590981.novalocal systemd[1]: Stopped Network Manager Wait Online.
Jan 21 17:14:15 np0005590981.novalocal systemd[1]: Stopping Network Manager Wait Online...
Jan 21 17:14:15 np0005590981.novalocal NetworkManager[855]: <info>  [1769015655.4351] caught SIGTERM, shutting down normally.
Jan 21 17:14:15 np0005590981.novalocal systemd[1]: Stopping Network Manager...
Jan 21 17:14:15 np0005590981.novalocal NetworkManager[855]: <info>  [1769015655.4364] dhcp4 (eth0): canceled DHCP transaction
Jan 21 17:14:15 np0005590981.novalocal NetworkManager[855]: <info>  [1769015655.4365] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 21 17:14:15 np0005590981.novalocal NetworkManager[855]: <info>  [1769015655.4365] dhcp4 (eth0): state changed no lease
Jan 21 17:14:15 np0005590981.novalocal NetworkManager[855]: <info>  [1769015655.4368] manager: NetworkManager state is now CONNECTING
Jan 21 17:14:15 np0005590981.novalocal NetworkManager[855]: <info>  [1769015655.4525] dhcp4 (eth1): canceled DHCP transaction
Jan 21 17:14:15 np0005590981.novalocal NetworkManager[855]: <info>  [1769015655.4526] dhcp4 (eth1): state changed no lease
Jan 21 17:14:15 np0005590981.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 21 17:14:15 np0005590981.novalocal NetworkManager[855]: <info>  [1769015655.4598] exiting (success)
Jan 21 17:14:15 np0005590981.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 21 17:14:15 np0005590981.novalocal systemd[1]: NetworkManager.service: Deactivated successfully.
Jan 21 17:14:15 np0005590981.novalocal systemd[1]: Stopped Network Manager.
Jan 21 17:14:15 np0005590981.novalocal systemd[1]: NetworkManager.service: Consumed 1.243s CPU time, 10.0M memory peak.
Jan 21 17:14:15 np0005590981.novalocal systemd[1]: Starting Network Manager...
Jan 21 17:14:15 np0005590981.novalocal NetworkManager[7195]: <info>  [1769015655.5062] NetworkManager (version 1.54.3-2.el9) is starting... (after a restart, boot:23f14126-aa07-49ac-87bf-97b3c4c8c82d)
Jan 21 17:14:15 np0005590981.novalocal NetworkManager[7195]: <info>  [1769015655.5064] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 21 17:14:15 np0005590981.novalocal NetworkManager[7195]: <info>  [1769015655.5121] manager[0x559176651000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 21 17:14:15 np0005590981.novalocal systemd[1]: Starting Hostname Service...
Jan 21 17:14:15 np0005590981.novalocal systemd[1]: Started Hostname Service.
Jan 21 17:14:15 np0005590981.novalocal NetworkManager[7195]: <info>  [1769015655.5900] hostname: hostname: using hostnamed
Jan 21 17:14:15 np0005590981.novalocal NetworkManager[7195]: <info>  [1769015655.5902] hostname: static hostname changed from (none) to "np0005590981.novalocal"
Jan 21 17:14:15 np0005590981.novalocal NetworkManager[7195]: <info>  [1769015655.5907] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 21 17:14:15 np0005590981.novalocal NetworkManager[7195]: <info>  [1769015655.5912] manager[0x559176651000]: rfkill: Wi-Fi hardware radio set enabled
Jan 21 17:14:15 np0005590981.novalocal NetworkManager[7195]: <info>  [1769015655.5913] manager[0x559176651000]: rfkill: WWAN hardware radio set enabled
Jan 21 17:14:15 np0005590981.novalocal NetworkManager[7195]: <info>  [1769015655.5948] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 21 17:14:15 np0005590981.novalocal NetworkManager[7195]: <info>  [1769015655.5948] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 21 17:14:15 np0005590981.novalocal NetworkManager[7195]: <info>  [1769015655.5949] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 21 17:14:15 np0005590981.novalocal NetworkManager[7195]: <info>  [1769015655.5949] manager: Networking is enabled by state file
Jan 21 17:14:15 np0005590981.novalocal NetworkManager[7195]: <info>  [1769015655.5952] settings: Loaded settings plugin: keyfile (internal)
Jan 21 17:14:15 np0005590981.novalocal NetworkManager[7195]: <info>  [1769015655.5957] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 21 17:14:15 np0005590981.novalocal NetworkManager[7195]: <info>  [1769015655.5990] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 21 17:14:15 np0005590981.novalocal NetworkManager[7195]: <info>  [1769015655.6003] dhcp: init: Using DHCP client 'internal'
Jan 21 17:14:15 np0005590981.novalocal NetworkManager[7195]: <info>  [1769015655.6006] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 21 17:14:15 np0005590981.novalocal NetworkManager[7195]: <info>  [1769015655.6013] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 17:14:15 np0005590981.novalocal NetworkManager[7195]: <info>  [1769015655.6021] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 21 17:14:15 np0005590981.novalocal NetworkManager[7195]: <info>  [1769015655.6031] device (lo): Activation: starting connection 'lo' (5c2c2fbe-fb9d-4e67-b102-f58e14318d32)
Jan 21 17:14:15 np0005590981.novalocal NetworkManager[7195]: <info>  [1769015655.6039] device (eth0): carrier: link connected
Jan 21 17:14:15 np0005590981.novalocal NetworkManager[7195]: <info>  [1769015655.6045] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 21 17:14:15 np0005590981.novalocal NetworkManager[7195]: <info>  [1769015655.6051] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Jan 21 17:14:15 np0005590981.novalocal NetworkManager[7195]: <info>  [1769015655.6052] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 21 17:14:15 np0005590981.novalocal NetworkManager[7195]: <info>  [1769015655.6060] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 21 17:14:15 np0005590981.novalocal NetworkManager[7195]: <info>  [1769015655.6068] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 21 17:14:15 np0005590981.novalocal NetworkManager[7195]: <info>  [1769015655.6076] device (eth1): carrier: link connected
Jan 21 17:14:15 np0005590981.novalocal NetworkManager[7195]: <info>  [1769015655.6080] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 21 17:14:15 np0005590981.novalocal NetworkManager[7195]: <info>  [1769015655.6088] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (3cb94cdd-a1f2-3cfa-b2b4-788809cba9e0) (indicated)
Jan 21 17:14:15 np0005590981.novalocal NetworkManager[7195]: <info>  [1769015655.6088] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 21 17:14:15 np0005590981.novalocal NetworkManager[7195]: <info>  [1769015655.6095] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 21 17:14:15 np0005590981.novalocal NetworkManager[7195]: <info>  [1769015655.6102] device (eth1): Activation: starting connection 'Wired connection 1' (3cb94cdd-a1f2-3cfa-b2b4-788809cba9e0)
Jan 21 17:14:15 np0005590981.novalocal systemd[1]: Started Network Manager.
Jan 21 17:14:15 np0005590981.novalocal NetworkManager[7195]: <info>  [1769015655.6110] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 21 17:14:15 np0005590981.novalocal NetworkManager[7195]: <info>  [1769015655.6116] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 21 17:14:15 np0005590981.novalocal NetworkManager[7195]: <info>  [1769015655.6120] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 21 17:14:15 np0005590981.novalocal NetworkManager[7195]: <info>  [1769015655.6122] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 21 17:14:15 np0005590981.novalocal NetworkManager[7195]: <info>  [1769015655.6125] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 21 17:14:15 np0005590981.novalocal NetworkManager[7195]: <info>  [1769015655.6129] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 21 17:14:15 np0005590981.novalocal NetworkManager[7195]: <info>  [1769015655.6131] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 21 17:14:15 np0005590981.novalocal NetworkManager[7195]: <info>  [1769015655.6134] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 21 17:14:15 np0005590981.novalocal NetworkManager[7195]: <info>  [1769015655.6137] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 21 17:14:15 np0005590981.novalocal NetworkManager[7195]: <info>  [1769015655.6145] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 21 17:14:15 np0005590981.novalocal NetworkManager[7195]: <info>  [1769015655.6149] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 21 17:14:15 np0005590981.novalocal NetworkManager[7195]: <info>  [1769015655.6163] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 21 17:14:15 np0005590981.novalocal NetworkManager[7195]: <info>  [1769015655.6167] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 21 17:14:15 np0005590981.novalocal NetworkManager[7195]: <info>  [1769015655.6193] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 21 17:14:15 np0005590981.novalocal NetworkManager[7195]: <info>  [1769015655.6197] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 21 17:14:15 np0005590981.novalocal NetworkManager[7195]: <info>  [1769015655.6204] device (lo): Activation: successful, device activated.
Jan 21 17:14:15 np0005590981.novalocal systemd[1]: Starting Network Manager Wait Online...
Jan 21 17:14:15 np0005590981.novalocal sudo[7183]: pam_unix(sudo:session): session closed for user root
Jan 21 17:14:15 np0005590981.novalocal python3[7250]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163e3b-3c83-9cf0-e2ec-0000000000b2-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 17:14:17 np0005590981.novalocal NetworkManager[7195]: <info>  [1769015657.2509] dhcp4 (eth0): state changed new lease, address=38.102.83.50
Jan 21 17:14:17 np0005590981.novalocal NetworkManager[7195]: <info>  [1769015657.2523] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 21 17:14:17 np0005590981.novalocal NetworkManager[7195]: <info>  [1769015657.2610] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 21 17:14:17 np0005590981.novalocal NetworkManager[7195]: <info>  [1769015657.2665] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 21 17:14:17 np0005590981.novalocal NetworkManager[7195]: <info>  [1769015657.2668] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 21 17:14:17 np0005590981.novalocal NetworkManager[7195]: <info>  [1769015657.2674] manager: NetworkManager state is now CONNECTED_SITE
Jan 21 17:14:17 np0005590981.novalocal NetworkManager[7195]: <info>  [1769015657.2680] device (eth0): Activation: successful, device activated.
Jan 21 17:14:17 np0005590981.novalocal NetworkManager[7195]: <info>  [1769015657.2688] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 21 17:14:27 np0005590981.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 21 17:14:45 np0005590981.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 21 17:15:01 np0005590981.novalocal NetworkManager[7195]: <info>  [1769015701.3380] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 21 17:15:01 np0005590981.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 21 17:15:01 np0005590981.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 21 17:15:01 np0005590981.novalocal NetworkManager[7195]: <info>  [1769015701.3684] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 21 17:15:01 np0005590981.novalocal NetworkManager[7195]: <info>  [1769015701.3689] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 21 17:15:01 np0005590981.novalocal NetworkManager[7195]: <info>  [1769015701.3700] device (eth1): Activation: successful, device activated.
Jan 21 17:15:01 np0005590981.novalocal NetworkManager[7195]: <info>  [1769015701.3714] manager: startup complete
Jan 21 17:15:01 np0005590981.novalocal NetworkManager[7195]: <info>  [1769015701.3717] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Jan 21 17:15:01 np0005590981.novalocal NetworkManager[7195]: <warn>  [1769015701.3728] device (eth1): Activation: failed for connection 'Wired connection 1'
Jan 21 17:15:01 np0005590981.novalocal NetworkManager[7195]: <info>  [1769015701.3742] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Jan 21 17:15:01 np0005590981.novalocal systemd[1]: Finished Network Manager Wait Online.
Jan 21 17:15:01 np0005590981.novalocal NetworkManager[7195]: <info>  [1769015701.3904] dhcp4 (eth1): canceled DHCP transaction
Jan 21 17:15:01 np0005590981.novalocal NetworkManager[7195]: <info>  [1769015701.3904] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 21 17:15:01 np0005590981.novalocal NetworkManager[7195]: <info>  [1769015701.3905] dhcp4 (eth1): state changed no lease
Jan 21 17:15:01 np0005590981.novalocal NetworkManager[7195]: <info>  [1769015701.3919] policy: auto-activating connection 'ci-private-network' (811a0e92-136b-562e-ae39-d602e48987d5)
Jan 21 17:15:01 np0005590981.novalocal NetworkManager[7195]: <info>  [1769015701.3923] device (eth1): Activation: starting connection 'ci-private-network' (811a0e92-136b-562e-ae39-d602e48987d5)
Jan 21 17:15:01 np0005590981.novalocal NetworkManager[7195]: <info>  [1769015701.3924] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 21 17:15:01 np0005590981.novalocal NetworkManager[7195]: <info>  [1769015701.3927] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 21 17:15:01 np0005590981.novalocal NetworkManager[7195]: <info>  [1769015701.3934] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 21 17:15:01 np0005590981.novalocal NetworkManager[7195]: <info>  [1769015701.3949] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 21 17:15:01 np0005590981.novalocal NetworkManager[7195]: <info>  [1769015701.4013] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 21 17:15:01 np0005590981.novalocal NetworkManager[7195]: <info>  [1769015701.4015] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 21 17:15:01 np0005590981.novalocal NetworkManager[7195]: <info>  [1769015701.4021] device (eth1): Activation: successful, device activated.
Jan 21 17:15:11 np0005590981.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 21 17:15:11 np0005590981.novalocal systemd[4309]: Starting Mark boot as successful...
Jan 21 17:15:11 np0005590981.novalocal systemd[4309]: Finished Mark boot as successful.
Jan 21 17:15:16 np0005590981.novalocal sshd-session[4320]: Received disconnect from 38.102.83.114 port 59028:11: disconnected by user
Jan 21 17:15:16 np0005590981.novalocal sshd-session[4320]: Disconnected from user zuul 38.102.83.114 port 59028
Jan 21 17:15:16 np0005590981.novalocal sshd-session[4305]: pam_unix(sshd:session): session closed for user zuul
Jan 21 17:15:16 np0005590981.novalocal systemd-logind[782]: Session 1 logged out. Waiting for processes to exit.
Jan 21 17:15:42 np0005590981.novalocal sshd-session[7298]: Invalid user validator from 64.227.98.100 port 44734
Jan 21 17:15:42 np0005590981.novalocal sshd-session[7298]: Connection closed by invalid user validator 64.227.98.100 port 44734 [preauth]
Jan 21 17:15:44 np0005590981.novalocal sshd-session[7300]: Accepted publickey for zuul from 38.102.83.114 port 33194 ssh2: RSA SHA256:ysQWREmd15tAHSjzta5qnkaO4KfePVWGAmHWawN5j2o
Jan 21 17:15:44 np0005590981.novalocal systemd-logind[782]: New session 3 of user zuul.
Jan 21 17:15:44 np0005590981.novalocal systemd[1]: Started Session 3 of User zuul.
Jan 21 17:15:44 np0005590981.novalocal sshd-session[7300]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 21 17:15:44 np0005590981.novalocal sudo[7379]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zgvhfmfsgucbcgkzchfphvgabdsjmasp ; OS_CLOUD=vexxhost /usr/bin/python3'
Jan 21 17:15:44 np0005590981.novalocal sudo[7379]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:15:44 np0005590981.novalocal python3[7381]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 21 17:15:44 np0005590981.novalocal sudo[7379]: pam_unix(sudo:session): session closed for user root
Jan 21 17:15:45 np0005590981.novalocal sudo[7452]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mvxedguxkoxwbslonbfaepdsiqqtzgdv ; OS_CLOUD=vexxhost /usr/bin/python3'
Jan 21 17:15:45 np0005590981.novalocal sudo[7452]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:15:45 np0005590981.novalocal python3[7454]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769015744.4517515-312-125225529184817/source _original_basename=tmpd2y2es7g follow=False checksum=0c62bf202a1f288812fa4e950f292bc5b17eaef9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:15:45 np0005590981.novalocal sudo[7452]: pam_unix(sudo:session): session closed for user root
Jan 21 17:15:48 np0005590981.novalocal sshd-session[7303]: Connection closed by 38.102.83.114 port 33194
Jan 21 17:15:48 np0005590981.novalocal sshd-session[7300]: pam_unix(sshd:session): session closed for user zuul
Jan 21 17:15:48 np0005590981.novalocal systemd[1]: session-3.scope: Deactivated successfully.
Jan 21 17:15:48 np0005590981.novalocal systemd-logind[782]: Session 3 logged out. Waiting for processes to exit.
Jan 21 17:15:48 np0005590981.novalocal systemd-logind[782]: Removed session 3.
Jan 21 17:17:58 np0005590981.novalocal sshd-session[7483]: Invalid user validator from 64.227.98.100 port 39484
Jan 21 17:17:58 np0005590981.novalocal sshd-session[7483]: Connection closed by invalid user validator 64.227.98.100 port 39484 [preauth]
Jan 21 17:18:12 np0005590981.novalocal systemd[4309]: Created slice User Background Tasks Slice.
Jan 21 17:18:12 np0005590981.novalocal systemd[4309]: Starting Cleanup of User's Temporary Files and Directories...
Jan 21 17:18:12 np0005590981.novalocal systemd[4309]: Finished Cleanup of User's Temporary Files and Directories.
Jan 21 17:20:10 np0005590981.novalocal sshd-session[7487]: Invalid user nobara from 64.227.98.100 port 37826
Jan 21 17:20:10 np0005590981.novalocal sshd-session[7487]: Connection closed by invalid user nobara 64.227.98.100 port 37826 [preauth]
Jan 21 17:21:25 np0005590981.novalocal sshd-session[7490]: Accepted publickey for zuul from 38.102.83.114 port 58880 ssh2: RSA SHA256:ysQWREmd15tAHSjzta5qnkaO4KfePVWGAmHWawN5j2o
Jan 21 17:21:25 np0005590981.novalocal systemd-logind[782]: New session 4 of user zuul.
Jan 21 17:21:25 np0005590981.novalocal systemd[1]: Started Session 4 of User zuul.
Jan 21 17:21:25 np0005590981.novalocal sshd-session[7490]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 21 17:21:25 np0005590981.novalocal sudo[7517]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yxhflythqoqkjonmdesjernllorozepp ; /usr/bin/python3'
Jan 21 17:21:25 np0005590981.novalocal sudo[7517]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:21:25 np0005590981.novalocal python3[7519]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda _uses_shell=True zuul_log_id=fa163e3b-3c83-62d3-3c27-00000000216c-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 17:21:25 np0005590981.novalocal sudo[7517]: pam_unix(sudo:session): session closed for user root
Jan 21 17:21:26 np0005590981.novalocal sudo[7546]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ksodkjbdwtewimdjuikvmtmftulfszqg ; /usr/bin/python3'
Jan 21 17:21:26 np0005590981.novalocal sudo[7546]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:21:26 np0005590981.novalocal python3[7548]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:21:26 np0005590981.novalocal sudo[7546]: pam_unix(sudo:session): session closed for user root
Jan 21 17:21:26 np0005590981.novalocal sudo[7572]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hhhizaojdvhbmswjviitpzadxhzlfaha ; /usr/bin/python3'
Jan 21 17:21:26 np0005590981.novalocal sudo[7572]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:21:26 np0005590981.novalocal python3[7574]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:21:26 np0005590981.novalocal sudo[7572]: pam_unix(sudo:session): session closed for user root
Jan 21 17:21:26 np0005590981.novalocal sudo[7598]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iotvwoyihnvgeppoogsfufvsnyylsktc ; /usr/bin/python3'
Jan 21 17:21:26 np0005590981.novalocal sudo[7598]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:21:26 np0005590981.novalocal python3[7600]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:21:26 np0005590981.novalocal sudo[7598]: pam_unix(sudo:session): session closed for user root
Jan 21 17:21:26 np0005590981.novalocal sudo[7624]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-harabdqwodgflacklwqzdjbzhqezzunf ; /usr/bin/python3'
Jan 21 17:21:26 np0005590981.novalocal sudo[7624]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:21:26 np0005590981.novalocal python3[7626]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:21:27 np0005590981.novalocal sudo[7624]: pam_unix(sudo:session): session closed for user root
Jan 21 17:21:27 np0005590981.novalocal sudo[7650]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zlouoelelsxuvswhhivtyohzkdmadvdu ; /usr/bin/python3'
Jan 21 17:21:27 np0005590981.novalocal sudo[7650]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:21:27 np0005590981.novalocal python3[7652]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:21:27 np0005590981.novalocal sudo[7650]: pam_unix(sudo:session): session closed for user root
Jan 21 17:21:27 np0005590981.novalocal sudo[7728]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ojoabvutuebrxbimhcynvbuqkhfkqdvf ; /usr/bin/python3'
Jan 21 17:21:27 np0005590981.novalocal sudo[7728]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:21:28 np0005590981.novalocal python3[7730]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 21 17:21:28 np0005590981.novalocal sudo[7728]: pam_unix(sudo:session): session closed for user root
Jan 21 17:21:28 np0005590981.novalocal sudo[7801]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pftsivvtgxuywkjyxlmldngfjqnbsbmm ; /usr/bin/python3'
Jan 21 17:21:28 np0005590981.novalocal sudo[7801]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:21:28 np0005590981.novalocal python3[7803]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769016087.8741918-512-201811692621172/source _original_basename=tmpv2k_7are follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:21:28 np0005590981.novalocal sudo[7801]: pam_unix(sudo:session): session closed for user root
Jan 21 17:21:29 np0005590981.novalocal sudo[7851]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oaupjabgeuirjgrhmwcgwfiwcghplzvk ; /usr/bin/python3'
Jan 21 17:21:29 np0005590981.novalocal sudo[7851]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:21:29 np0005590981.novalocal python3[7853]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 21 17:21:29 np0005590981.novalocal systemd[1]: Reloading.
Jan 21 17:21:29 np0005590981.novalocal systemd-rc-local-generator[7872]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 17:21:29 np0005590981.novalocal sudo[7851]: pam_unix(sudo:session): session closed for user root
Jan 21 17:21:30 np0005590981.novalocal sudo[7907]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-whfcqzoyxzfozxhhdevbfcwdsysjzhgd ; /usr/bin/python3'
Jan 21 17:21:30 np0005590981.novalocal sudo[7907]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:21:30 np0005590981.novalocal python3[7909]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Jan 21 17:21:30 np0005590981.novalocal sudo[7907]: pam_unix(sudo:session): session closed for user root
Jan 21 17:21:31 np0005590981.novalocal sudo[7933]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pazwxhhdzbzeyhjdbulpkriokymjksyd ; /usr/bin/python3'
Jan 21 17:21:31 np0005590981.novalocal sudo[7933]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:21:31 np0005590981.novalocal python3[7935]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 17:21:31 np0005590981.novalocal sudo[7933]: pam_unix(sudo:session): session closed for user root
Jan 21 17:21:31 np0005590981.novalocal sudo[7961]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dlodgkhbodnljeqmdnicdmiwmqohiwot ; /usr/bin/python3'
Jan 21 17:21:31 np0005590981.novalocal sudo[7961]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:21:31 np0005590981.novalocal python3[7963]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 17:21:31 np0005590981.novalocal sudo[7961]: pam_unix(sudo:session): session closed for user root
Jan 21 17:21:31 np0005590981.novalocal sudo[7989]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wsczcxbhurqpoobpblyhvitimglbauqg ; /usr/bin/python3'
Jan 21 17:21:31 np0005590981.novalocal sudo[7989]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:21:31 np0005590981.novalocal python3[7991]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 17:21:31 np0005590981.novalocal sudo[7989]: pam_unix(sudo:session): session closed for user root
Jan 21 17:21:31 np0005590981.novalocal sudo[8017]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tytogwxkarzacgwdvocwjcbeuqpnpiok ; /usr/bin/python3'
Jan 21 17:21:31 np0005590981.novalocal sudo[8017]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:21:32 np0005590981.novalocal python3[8019]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 17:21:32 np0005590981.novalocal sudo[8017]: pam_unix(sudo:session): session closed for user root
Jan 21 17:21:32 np0005590981.novalocal python3[8046]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;
                                                       _uses_shell=True zuul_log_id=fa163e3b-3c83-62d3-3c27-000000002173-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 17:21:33 np0005590981.novalocal python3[8076]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jan 21 17:21:35 np0005590981.novalocal sshd-session[7493]: Connection closed by 38.102.83.114 port 58880
Jan 21 17:21:35 np0005590981.novalocal sshd-session[7490]: pam_unix(sshd:session): session closed for user zuul
Jan 21 17:21:35 np0005590981.novalocal systemd[1]: session-4.scope: Deactivated successfully.
Jan 21 17:21:35 np0005590981.novalocal systemd[1]: session-4.scope: Consumed 3.741s CPU time.
Jan 21 17:21:35 np0005590981.novalocal systemd-logind[782]: Session 4 logged out. Waiting for processes to exit.
Jan 21 17:21:35 np0005590981.novalocal systemd-logind[782]: Removed session 4.
Jan 21 17:21:37 np0005590981.novalocal sshd-session[8081]: Accepted publickey for zuul from 38.102.83.114 port 45806 ssh2: RSA SHA256:ysQWREmd15tAHSjzta5qnkaO4KfePVWGAmHWawN5j2o
Jan 21 17:21:37 np0005590981.novalocal systemd-logind[782]: New session 5 of user zuul.
Jan 21 17:21:37 np0005590981.novalocal systemd[1]: Started Session 5 of User zuul.
Jan 21 17:21:37 np0005590981.novalocal sshd-session[8081]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 21 17:21:37 np0005590981.novalocal sudo[8108]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kqeevzsuhxiwlebqcqxibpuduqaysjku ; /usr/bin/python3'
Jan 21 17:21:37 np0005590981.novalocal sudo[8108]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:21:37 np0005590981.novalocal python3[8110]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Jan 21 17:21:43 np0005590981.novalocal setsebool[8153]: The virt_use_nfs policy boolean was changed to 1 by root
Jan 21 17:21:43 np0005590981.novalocal setsebool[8153]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Jan 21 17:21:54 np0005590981.novalocal kernel: SELinux:  Converting 385 SID table entries...
Jan 21 17:21:54 np0005590981.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Jan 21 17:21:54 np0005590981.novalocal kernel: SELinux:  policy capability open_perms=1
Jan 21 17:21:54 np0005590981.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Jan 21 17:21:54 np0005590981.novalocal kernel: SELinux:  policy capability always_check_network=0
Jan 21 17:21:54 np0005590981.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 21 17:21:54 np0005590981.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 21 17:21:54 np0005590981.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 21 17:22:05 np0005590981.novalocal kernel: SELinux:  Converting 388 SID table entries...
Jan 21 17:22:05 np0005590981.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Jan 21 17:22:05 np0005590981.novalocal kernel: SELinux:  policy capability open_perms=1
Jan 21 17:22:05 np0005590981.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Jan 21 17:22:05 np0005590981.novalocal kernel: SELinux:  policy capability always_check_network=0
Jan 21 17:22:05 np0005590981.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 21 17:22:05 np0005590981.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 21 17:22:05 np0005590981.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 21 17:22:21 np0005590981.novalocal sshd-session[8885]: Invalid user defi from 64.227.98.100 port 50790
Jan 21 17:22:21 np0005590981.novalocal sshd-session[8885]: Connection closed by invalid user defi 64.227.98.100 port 50790 [preauth]
Jan 21 17:22:23 np0005590981.novalocal dbus-broker-launch[769]: avc:  op=load_policy lsm=selinux seqno=4 res=1
Jan 21 17:22:23 np0005590981.novalocal systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 21 17:22:23 np0005590981.novalocal systemd[1]: Starting man-db-cache-update.service...
Jan 21 17:22:23 np0005590981.novalocal systemd[1]: Reloading.
Jan 21 17:22:23 np0005590981.novalocal systemd-rc-local-generator[8928]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 17:22:23 np0005590981.novalocal systemd[1]: Queuing reload/restart jobs for marked units…
Jan 21 17:22:25 np0005590981.novalocal sudo[8108]: pam_unix(sudo:session): session closed for user root
Jan 21 17:22:29 np0005590981.novalocal python3[13939]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"
                                                        _uses_shell=True zuul_log_id=fa163e3b-3c83-5ee0-40cd-00000000000b-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 17:22:30 np0005590981.novalocal kernel: evm: overlay not supported
Jan 21 17:22:30 np0005590981.novalocal systemd[4309]: Starting D-Bus User Message Bus...
Jan 21 17:22:30 np0005590981.novalocal dbus-broker-launch[14344]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Jan 21 17:22:30 np0005590981.novalocal dbus-broker-launch[14344]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Jan 21 17:22:30 np0005590981.novalocal systemd[4309]: Started D-Bus User Message Bus.
Jan 21 17:22:30 np0005590981.novalocal dbus-broker-lau[14344]: Ready
Jan 21 17:22:30 np0005590981.novalocal systemd[4309]: selinux: avc:  op=load_policy lsm=selinux seqno=4 res=1
Jan 21 17:22:30 np0005590981.novalocal systemd[4309]: Created slice Slice /user.
Jan 21 17:22:30 np0005590981.novalocal systemd[4309]: podman-14271.scope: unit configures an IP firewall, but not running as root.
Jan 21 17:22:30 np0005590981.novalocal systemd[4309]: (This warning is only shown for the first unit using IP firewalling.)
Jan 21 17:22:30 np0005590981.novalocal systemd[4309]: Started podman-14271.scope.
Jan 21 17:22:30 np0005590981.novalocal systemd[4309]: Started podman-pause-92c679a1.scope.
Jan 21 17:22:31 np0005590981.novalocal sudo[14871]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qmfzyjsvpuwivzktyqykpmgoqqlpyymh ; /usr/bin/python3'
Jan 21 17:22:31 np0005590981.novalocal sudo[14871]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:22:31 np0005590981.novalocal python3[14883]: ansible-ansible.builtin.blockinfile Invoked with state=present insertafter=EOF dest=/etc/containers/registries.conf content=[[registry]]
                                                       location = "38.102.83.128:5001"
                                                       insecure = true path=/etc/containers/registries.conf block=[[registry]]
                                                       location = "38.102.83.128:5001"
                                                       insecure = true marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:22:31 np0005590981.novalocal python3[14883]: ansible-ansible.builtin.blockinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Jan 21 17:22:31 np0005590981.novalocal sudo[14871]: pam_unix(sudo:session): session closed for user root
Jan 21 17:22:31 np0005590981.novalocal sshd-session[8084]: Connection closed by 38.102.83.114 port 45806
Jan 21 17:22:31 np0005590981.novalocal sshd-session[8081]: pam_unix(sshd:session): session closed for user zuul
Jan 21 17:22:31 np0005590981.novalocal systemd[1]: session-5.scope: Deactivated successfully.
Jan 21 17:22:31 np0005590981.novalocal systemd[1]: session-5.scope: Consumed 42.401s CPU time.
Jan 21 17:22:31 np0005590981.novalocal systemd-logind[782]: Session 5 logged out. Waiting for processes to exit.
Jan 21 17:22:31 np0005590981.novalocal systemd-logind[782]: Removed session 5.
Jan 21 17:22:50 np0005590981.novalocal sshd-session[23778]: Unable to negotiate with 38.102.83.17 port 50728: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Jan 21 17:22:50 np0005590981.novalocal sshd-session[23780]: Unable to negotiate with 38.102.83.17 port 50712: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Jan 21 17:22:50 np0005590981.novalocal sshd-session[23779]: Unable to negotiate with 38.102.83.17 port 50718: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Jan 21 17:22:50 np0005590981.novalocal sshd-session[23783]: Connection closed by 38.102.83.17 port 50696 [preauth]
Jan 21 17:22:50 np0005590981.novalocal sshd-session[23785]: Connection closed by 38.102.83.17 port 50698 [preauth]
Jan 21 17:22:55 np0005590981.novalocal sshd-session[25651]: Accepted publickey for zuul from 38.102.83.114 port 43626 ssh2: RSA SHA256:ysQWREmd15tAHSjzta5qnkaO4KfePVWGAmHWawN5j2o
Jan 21 17:22:55 np0005590981.novalocal systemd-logind[782]: New session 6 of user zuul.
Jan 21 17:22:55 np0005590981.novalocal systemd[1]: Started Session 6 of User zuul.
Jan 21 17:22:55 np0005590981.novalocal sshd-session[25651]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 21 17:22:55 np0005590981.novalocal python3[25748]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGx+5/v6p1sZTvlJoqquALZdOIuSs2wH9IZoSHpl0s6ykG/rAIn3cOK4zoFCLX0HhdWod89paRvfFhj4wpMoxh4= zuul@np0005590980.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 17:22:55 np0005590981.novalocal sudo[25970]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tjwmiwuikxeodcjcblofkfpomhtoppmm ; /usr/bin/python3'
Jan 21 17:22:55 np0005590981.novalocal sudo[25970]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:22:55 np0005590981.novalocal python3[25981]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGx+5/v6p1sZTvlJoqquALZdOIuSs2wH9IZoSHpl0s6ykG/rAIn3cOK4zoFCLX0HhdWod89paRvfFhj4wpMoxh4= zuul@np0005590980.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 17:22:55 np0005590981.novalocal sudo[25970]: pam_unix(sudo:session): session closed for user root
Jan 21 17:22:56 np0005590981.novalocal sudo[26316]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hhymwjbfrntrvelnpxjyphnbkkjxmdkx ; /usr/bin/python3'
Jan 21 17:22:56 np0005590981.novalocal sudo[26316]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:22:56 np0005590981.novalocal python3[26322]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005590981.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Jan 21 17:22:56 np0005590981.novalocal useradd[26402]: new group: name=cloud-admin, GID=1002
Jan 21 17:22:56 np0005590981.novalocal useradd[26402]: new user: name=cloud-admin, UID=1002, GID=1002, home=/home/cloud-admin, shell=/bin/bash, from=none
Jan 21 17:22:56 np0005590981.novalocal sudo[26316]: pam_unix(sudo:session): session closed for user root
Jan 21 17:22:56 np0005590981.novalocal sudo[26557]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sqvjibxmriasxgvejhabuqztabdfoezb ; /usr/bin/python3'
Jan 21 17:22:56 np0005590981.novalocal sudo[26557]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:22:57 np0005590981.novalocal python3[26564]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGx+5/v6p1sZTvlJoqquALZdOIuSs2wH9IZoSHpl0s6ykG/rAIn3cOK4zoFCLX0HhdWod89paRvfFhj4wpMoxh4= zuul@np0005590980.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 17:22:57 np0005590981.novalocal sudo[26557]: pam_unix(sudo:session): session closed for user root
Jan 21 17:22:57 np0005590981.novalocal sudo[26801]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hwfuquqqkqguwdpfdgtxismgsetdbdmf ; /usr/bin/python3'
Jan 21 17:22:57 np0005590981.novalocal sudo[26801]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:22:57 np0005590981.novalocal python3[26814]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 21 17:22:57 np0005590981.novalocal sudo[26801]: pam_unix(sudo:session): session closed for user root
Jan 21 17:22:57 np0005590981.novalocal sudo[27102]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tnggmtbcxtmptxnoulcjbgxwudcaqekf ; /usr/bin/python3'
Jan 21 17:22:57 np0005590981.novalocal sudo[27102]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:22:58 np0005590981.novalocal python3[27110]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1769016177.3488786-151-159799291916515/source _original_basename=tmpiupucfso follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:22:58 np0005590981.novalocal sudo[27102]: pam_unix(sudo:session): session closed for user root
Jan 21 17:22:58 np0005590981.novalocal sudo[27405]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-engbfiwwcmicnfynrlcfxpmuaourpxeq ; /usr/bin/python3'
Jan 21 17:22:58 np0005590981.novalocal sudo[27405]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:22:59 np0005590981.novalocal python3[27407]: ansible-ansible.builtin.hostname Invoked with name=compute-0 use=systemd
Jan 21 17:22:59 np0005590981.novalocal systemd[1]: Starting Hostname Service...
Jan 21 17:22:59 np0005590981.novalocal systemd[1]: Started Hostname Service.
Jan 21 17:22:59 np0005590981.novalocal systemd-hostnamed[27506]: Changed pretty hostname to 'compute-0'
Jan 21 17:22:59 compute-0 systemd-hostnamed[27506]: Hostname set to <compute-0> (static)
Jan 21 17:22:59 compute-0 NetworkManager[7195]: <info>  [1769016179.3180] hostname: static hostname changed from "np0005590981.novalocal" to "compute-0"
Jan 21 17:22:59 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 21 17:22:59 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 21 17:22:59 compute-0 sudo[27405]: pam_unix(sudo:session): session closed for user root
Jan 21 17:22:59 compute-0 sshd-session[25695]: Connection closed by 38.102.83.114 port 43626
Jan 21 17:22:59 compute-0 sshd-session[25651]: pam_unix(sshd:session): session closed for user zuul
Jan 21 17:22:59 compute-0 systemd[1]: session-6.scope: Deactivated successfully.
Jan 21 17:22:59 compute-0 systemd[1]: session-6.scope: Consumed 2.223s CPU time.
Jan 21 17:22:59 compute-0 systemd-logind[782]: Session 6 logged out. Waiting for processes to exit.
Jan 21 17:22:59 compute-0 systemd-logind[782]: Removed session 6.
Jan 21 17:23:06 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 21 17:23:06 compute-0 systemd[1]: Finished man-db-cache-update.service.
Jan 21 17:23:06 compute-0 systemd[1]: man-db-cache-update.service: Consumed 49.214s CPU time.
Jan 21 17:23:06 compute-0 systemd[1]: run-rd3fe4d3f12404f4d9d8e9e3d9ef3cc0f.service: Deactivated successfully.
Jan 21 17:23:09 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 21 17:23:29 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 21 17:24:40 compute-0 sshd-session[29940]: Connection closed by authenticating user root 64.227.98.100 port 50656 [preauth]
Jan 21 17:26:11 compute-0 systemd[1]: Starting Cleanup of Temporary Directories...
Jan 21 17:26:11 compute-0 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Jan 21 17:26:11 compute-0 systemd[1]: Finished Cleanup of Temporary Directories.
Jan 21 17:26:11 compute-0 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Jan 21 17:26:56 compute-0 sshd-session[29948]: Invalid user blockchain from 64.227.98.100 port 39770
Jan 21 17:26:56 compute-0 sshd-session[29948]: Connection closed by invalid user blockchain 64.227.98.100 port 39770 [preauth]
Jan 21 17:27:32 compute-0 sshd-session[29951]: Accepted publickey for zuul from 38.102.83.17 port 60834 ssh2: RSA SHA256:ysQWREmd15tAHSjzta5qnkaO4KfePVWGAmHWawN5j2o
Jan 21 17:27:32 compute-0 systemd-logind[782]: New session 7 of user zuul.
Jan 21 17:27:32 compute-0 systemd[1]: Started Session 7 of User zuul.
Jan 21 17:27:32 compute-0 sshd-session[29951]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 21 17:27:33 compute-0 python3[30027]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 17:27:34 compute-0 sudo[30141]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-koqralrqwdpdqtmodycctohfgngmenyz ; /usr/bin/python3'
Jan 21 17:27:34 compute-0 sudo[30141]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:27:34 compute-0 python3[30143]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 21 17:27:34 compute-0 sudo[30141]: pam_unix(sudo:session): session closed for user root
Jan 21 17:27:35 compute-0 sudo[30214]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jbuzzkmjlkzwyauocxbxvllaqkreomlm ; /usr/bin/python3'
Jan 21 17:27:35 compute-0 sudo[30214]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:27:35 compute-0 python3[30216]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769016454.518309-33819-220946832993388/source mode=0755 _original_basename=delorean.repo follow=False checksum=0f7c85cc67bf467c48edf98d5acc63e62d808324 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:27:35 compute-0 sudo[30214]: pam_unix(sudo:session): session closed for user root
Jan 21 17:27:35 compute-0 sudo[30240]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-magjehqbnxlvcnzklaymikuillvmsuxw ; /usr/bin/python3'
Jan 21 17:27:35 compute-0 sudo[30240]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:27:35 compute-0 python3[30242]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 21 17:27:35 compute-0 sudo[30240]: pam_unix(sudo:session): session closed for user root
Jan 21 17:27:35 compute-0 sudo[30313]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jcdhdfraljtmrbiomiswvlrocvfagjmb ; /usr/bin/python3'
Jan 21 17:27:35 compute-0 sudo[30313]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:27:35 compute-0 python3[30315]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769016454.518309-33819-220946832993388/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=4ebc56dead962b5d40b8d420dad43b948b84d3fc backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:27:35 compute-0 sudo[30313]: pam_unix(sudo:session): session closed for user root
Jan 21 17:27:35 compute-0 sudo[30339]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lkcbmddmmhutpjtubkaqxsirqlyfaqch ; /usr/bin/python3'
Jan 21 17:27:35 compute-0 sudo[30339]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:27:36 compute-0 python3[30341]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 21 17:27:36 compute-0 sudo[30339]: pam_unix(sudo:session): session closed for user root
Jan 21 17:27:36 compute-0 sudo[30412]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-goabegfxidvykzqiskezaubethesgkdz ; /usr/bin/python3'
Jan 21 17:27:36 compute-0 sudo[30412]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:27:36 compute-0 python3[30414]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769016454.518309-33819-220946832993388/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:27:36 compute-0 sudo[30412]: pam_unix(sudo:session): session closed for user root
Jan 21 17:27:36 compute-0 sudo[30438]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cxrwdxvfgtkfzlblugxfsdtezmvxnmfu ; /usr/bin/python3'
Jan 21 17:27:36 compute-0 sudo[30438]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:27:36 compute-0 python3[30440]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 21 17:27:36 compute-0 sudo[30438]: pam_unix(sudo:session): session closed for user root
Jan 21 17:27:36 compute-0 sudo[30511]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfesscqslfrxmxkypdwvwterohifwrqo ; /usr/bin/python3'
Jan 21 17:27:36 compute-0 sudo[30511]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:27:37 compute-0 python3[30513]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769016454.518309-33819-220946832993388/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:27:37 compute-0 sudo[30511]: pam_unix(sudo:session): session closed for user root
Jan 21 17:27:37 compute-0 sudo[30537]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xetwsawmedqmzjdyaxrkiggejxuzcyut ; /usr/bin/python3'
Jan 21 17:27:37 compute-0 sudo[30537]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:27:37 compute-0 python3[30539]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 21 17:27:37 compute-0 sudo[30537]: pam_unix(sudo:session): session closed for user root
Jan 21 17:27:37 compute-0 sudo[30610]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sepcjhgeeboixgvtbaxkotcxrofnryzi ; /usr/bin/python3'
Jan 21 17:27:37 compute-0 sudo[30610]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:27:37 compute-0 python3[30612]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769016454.518309-33819-220946832993388/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:27:37 compute-0 sudo[30610]: pam_unix(sudo:session): session closed for user root
Jan 21 17:27:37 compute-0 sudo[30636]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jzusudkduykuixuotlyycaphnanmyxtk ; /usr/bin/python3'
Jan 21 17:27:37 compute-0 sudo[30636]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:27:37 compute-0 python3[30638]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 21 17:27:37 compute-0 sudo[30636]: pam_unix(sudo:session): session closed for user root
Jan 21 17:27:38 compute-0 sudo[30709]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-haeecltujulzpnjzoksnzylntvaeotju ; /usr/bin/python3'
Jan 21 17:27:38 compute-0 sudo[30709]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:27:38 compute-0 python3[30711]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769016454.518309-33819-220946832993388/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:27:38 compute-0 sudo[30709]: pam_unix(sudo:session): session closed for user root
Jan 21 17:27:38 compute-0 sudo[30735]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-onsusagceqguvhggtegptlpdreuqdgmu ; /usr/bin/python3'
Jan 21 17:27:38 compute-0 sudo[30735]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:27:38 compute-0 python3[30737]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 21 17:27:38 compute-0 sudo[30735]: pam_unix(sudo:session): session closed for user root
Jan 21 17:27:38 compute-0 sudo[30808]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-keeebceewmttqvrbrazvbictyhctoczj ; /usr/bin/python3'
Jan 21 17:27:38 compute-0 sudo[30808]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:27:38 compute-0 python3[30810]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769016454.518309-33819-220946832993388/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=2583a70b3ee76a9837350b0837bc004a8e52405c backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:27:38 compute-0 sudo[30808]: pam_unix(sudo:session): session closed for user root
Jan 21 17:27:41 compute-0 sshd-session[30835]: Connection closed by 192.168.122.11 port 33162 [preauth]
Jan 21 17:27:41 compute-0 sshd-session[30836]: Connection closed by 192.168.122.11 port 33164 [preauth]
Jan 21 17:27:41 compute-0 sshd-session[30837]: Unable to negotiate with 192.168.122.11 port 33178: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Jan 21 17:27:41 compute-0 sshd-session[30838]: Unable to negotiate with 192.168.122.11 port 33188: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Jan 21 17:27:41 compute-0 sshd-session[30839]: Unable to negotiate with 192.168.122.11 port 33194: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Jan 21 17:28:36 compute-0 python3[30868]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 17:29:10 compute-0 sshd-session[30870]: Invalid user blockchain from 64.227.98.100 port 47916
Jan 21 17:29:10 compute-0 sshd-session[30870]: Connection closed by invalid user blockchain 64.227.98.100 port 47916 [preauth]
Jan 21 17:31:27 compute-0 sshd-session[30873]: Invalid user defi from 64.227.98.100 port 60256
Jan 21 17:31:27 compute-0 sshd-session[30873]: Connection closed by invalid user defi 64.227.98.100 port 60256 [preauth]
Jan 21 17:33:35 compute-0 sshd-session[29954]: Received disconnect from 38.102.83.17 port 60834:11: disconnected by user
Jan 21 17:33:35 compute-0 sshd-session[29954]: Disconnected from user zuul 38.102.83.17 port 60834
Jan 21 17:33:35 compute-0 sshd-session[29951]: pam_unix(sshd:session): session closed for user zuul
Jan 21 17:33:35 compute-0 systemd[1]: session-7.scope: Deactivated successfully.
Jan 21 17:33:35 compute-0 systemd[1]: session-7.scope: Consumed 4.659s CPU time.
Jan 21 17:33:35 compute-0 systemd-logind[782]: Session 7 logged out. Waiting for processes to exit.
Jan 21 17:33:35 compute-0 systemd-logind[782]: Removed session 7.
Jan 21 17:33:37 compute-0 sshd-session[30876]: Invalid user user from 64.227.98.100 port 57244
Jan 21 17:33:37 compute-0 sshd-session[30876]: Connection closed by invalid user user 64.227.98.100 port 57244 [preauth]
Jan 21 17:35:46 compute-0 sshd-session[30878]: Connection closed by authenticating user root 64.227.98.100 port 42872 [preauth]
Jan 21 17:38:03 compute-0 sshd-session[30882]: Connection closed by authenticating user root 64.227.98.100 port 54908 [preauth]
Jan 21 17:40:16 compute-0 sshd-session[30884]: Connection closed by authenticating user root 64.227.98.100 port 59428 [preauth]
Jan 21 17:41:17 compute-0 sshd-session[30886]: Invalid user user from 45.148.10.121 port 36408
Jan 21 17:41:17 compute-0 sshd-session[30886]: Connection closed by invalid user user 45.148.10.121 port 36408 [preauth]
Jan 21 17:41:56 compute-0 sshd-session[30888]: Accepted publickey for zuul from 192.168.122.30 port 52490 ssh2: ECDSA SHA256:cKen23vhgWniT1Xii+Y+iYDmAKCwydOnjbSSWnKmVRE
Jan 21 17:41:56 compute-0 systemd-logind[782]: New session 8 of user zuul.
Jan 21 17:41:56 compute-0 systemd[1]: Started Session 8 of User zuul.
Jan 21 17:41:56 compute-0 sshd-session[30888]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 21 17:41:57 compute-0 python3.9[31041]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 17:41:59 compute-0 sudo[31220]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qumqyvvrvitkmmeuaxlkumovcyimahto ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017319.5483403-39-238430766096049/AnsiballZ_command.py'
Jan 21 17:41:59 compute-0 sudo[31220]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:42:00 compute-0 python3.9[31222]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail
                                            pushd /var/tmp
                                            curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
                                            pushd repo-setup-main
                                            python3 -m venv ./venv
                                            PBR_VERSION=0.0.0 ./venv/bin/pip install ./
                                            ./venv/bin/repo-setup current-podified -b antelope
                                            popd
                                            rm -rf repo-setup-main
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 17:42:07 compute-0 sudo[31220]: pam_unix(sudo:session): session closed for user root
Jan 21 17:42:10 compute-0 sshd-session[30891]: Connection closed by 192.168.122.30 port 52490
Jan 21 17:42:10 compute-0 sshd-session[30888]: pam_unix(sshd:session): session closed for user zuul
Jan 21 17:42:10 compute-0 systemd-logind[782]: Session 8 logged out. Waiting for processes to exit.
Jan 21 17:42:10 compute-0 systemd[1]: session-8.scope: Deactivated successfully.
Jan 21 17:42:10 compute-0 systemd[1]: session-8.scope: Consumed 7.507s CPU time.
Jan 21 17:42:10 compute-0 systemd-logind[782]: Removed session 8.
Jan 21 17:42:15 compute-0 sshd-session[31280]: Accepted publickey for zuul from 192.168.122.30 port 52456 ssh2: ECDSA SHA256:cKen23vhgWniT1Xii+Y+iYDmAKCwydOnjbSSWnKmVRE
Jan 21 17:42:15 compute-0 systemd-logind[782]: New session 9 of user zuul.
Jan 21 17:42:15 compute-0 systemd[1]: Started Session 9 of User zuul.
Jan 21 17:42:15 compute-0 sshd-session[31280]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 21 17:42:16 compute-0 python3.9[31433]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 17:42:17 compute-0 sshd-session[31283]: Connection closed by 192.168.122.30 port 52456
Jan 21 17:42:17 compute-0 sshd-session[31280]: pam_unix(sshd:session): session closed for user zuul
Jan 21 17:42:17 compute-0 systemd[1]: session-9.scope: Deactivated successfully.
Jan 21 17:42:17 compute-0 systemd-logind[782]: Session 9 logged out. Waiting for processes to exit.
Jan 21 17:42:17 compute-0 systemd-logind[782]: Removed session 9.
Jan 21 17:42:33 compute-0 sshd-session[31464]: Accepted publickey for zuul from 192.168.122.30 port 58372 ssh2: ECDSA SHA256:cKen23vhgWniT1Xii+Y+iYDmAKCwydOnjbSSWnKmVRE
Jan 21 17:42:33 compute-0 systemd-logind[782]: New session 10 of user zuul.
Jan 21 17:42:33 compute-0 systemd[1]: Started Session 10 of User zuul.
Jan 21 17:42:33 compute-0 sshd-session[31464]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 21 17:42:33 compute-0 sshd-session[31462]: Invalid user polkadot from 64.227.98.100 port 43814
Jan 21 17:42:33 compute-0 sshd-session[31462]: Connection closed by invalid user polkadot 64.227.98.100 port 43814 [preauth]
Jan 21 17:42:33 compute-0 python3.9[31617]: ansible-ansible.legacy.ping Invoked with data=pong
Jan 21 17:42:35 compute-0 python3.9[31791]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 17:42:36 compute-0 sudo[31941]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hzonlqmluhqmoeteksqotcrdgdlcjghe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017355.587206-64-10631202110768/AnsiballZ_command.py'
Jan 21 17:42:36 compute-0 sudo[31941]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:42:36 compute-0 python3.9[31943]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 17:42:36 compute-0 sudo[31941]: pam_unix(sudo:session): session closed for user root
Jan 21 17:42:36 compute-0 sudo[32094]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldasesbyokcvmyegtgdfzejlprbcjmlp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017356.5487413-88-153128588509629/AnsiballZ_stat.py'
Jan 21 17:42:36 compute-0 sudo[32094]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:42:37 compute-0 python3.9[32096]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 17:42:37 compute-0 sudo[32094]: pam_unix(sudo:session): session closed for user root
Jan 21 17:42:37 compute-0 sudo[32246]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hxoagypcnzzxvreifzmoecxvdxvazovl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017357.3551376-104-58484223421503/AnsiballZ_file.py'
Jan 21 17:42:37 compute-0 sudo[32246]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:42:37 compute-0 python3.9[32248]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:42:37 compute-0 sudo[32246]: pam_unix(sudo:session): session closed for user root
Jan 21 17:42:38 compute-0 sudo[32398]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mcenwyfijogynunsdidclabvcakklfki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017358.1508784-120-104841172046628/AnsiballZ_stat.py'
Jan 21 17:42:38 compute-0 sudo[32398]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:42:38 compute-0 python3.9[32400]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:42:38 compute-0 sudo[32398]: pam_unix(sudo:session): session closed for user root
Jan 21 17:42:39 compute-0 sudo[32521]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjvsdzdpgactyybcwfnxvrosrvofbdpk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017358.1508784-120-104841172046628/AnsiballZ_copy.py'
Jan 21 17:42:39 compute-0 sudo[32521]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:42:39 compute-0 python3.9[32523]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1769017358.1508784-120-104841172046628/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:42:39 compute-0 sudo[32521]: pam_unix(sudo:session): session closed for user root
Jan 21 17:42:39 compute-0 sudo[32673]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ndcalvyzuzubaplfcdljzfnyyunrwhks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017359.5029545-150-131232074809345/AnsiballZ_setup.py'
Jan 21 17:42:39 compute-0 sudo[32673]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:42:40 compute-0 python3.9[32675]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 17:42:40 compute-0 sudo[32673]: pam_unix(sudo:session): session closed for user root
Jan 21 17:42:40 compute-0 sudo[32829]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jmjbikmzsqgrguynequvtxlnmynjzfwi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017360.593524-166-124174576655600/AnsiballZ_file.py'
Jan 21 17:42:40 compute-0 sudo[32829]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:42:41 compute-0 python3.9[32831]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 17:42:41 compute-0 sudo[32829]: pam_unix(sudo:session): session closed for user root
Jan 21 17:42:41 compute-0 sudo[32981]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fqpgqqfitpfvbyhnbuarcqtrajbzstis ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017361.488926-184-155665001320398/AnsiballZ_file.py'
Jan 21 17:42:41 compute-0 sudo[32981]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:42:41 compute-0 python3.9[32983]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 17:42:41 compute-0 sudo[32981]: pam_unix(sudo:session): session closed for user root
Jan 21 17:42:42 compute-0 python3.9[33133]: ansible-ansible.builtin.service_facts Invoked
Jan 21 17:42:46 compute-0 python3.9[33386]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:42:47 compute-0 python3.9[33536]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 17:42:48 compute-0 python3.9[33690]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 17:42:49 compute-0 sudo[33846]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-juvjviyxktbckovereowpgyufmarkaay ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017369.0690053-280-272104385547434/AnsiballZ_setup.py'
Jan 21 17:42:49 compute-0 sudo[33846]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:42:49 compute-0 python3.9[33848]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 21 17:42:49 compute-0 sudo[33846]: pam_unix(sudo:session): session closed for user root
Jan 21 17:42:50 compute-0 sudo[33930]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xykfptephemzlzfocmaypuuisbgvdjtt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017369.0690053-280-272104385547434/AnsiballZ_dnf.py'
Jan 21 17:42:50 compute-0 sudo[33930]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:42:50 compute-0 python3.9[33932]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 21 17:43:16 compute-0 sshd-session[34070]: Connection closed by 39.191.29.114 port 52052
Jan 21 17:43:34 compute-0 systemd[1]: Reloading.
Jan 21 17:43:34 compute-0 systemd-rc-local-generator[34125]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 17:43:34 compute-0 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Jan 21 17:43:34 compute-0 systemd[1]: Reloading.
Jan 21 17:43:34 compute-0 systemd-rc-local-generator[34178]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 17:43:35 compute-0 systemd[1]: Starting dnf makecache...
Jan 21 17:43:35 compute-0 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Jan 21 17:43:35 compute-0 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Jan 21 17:43:35 compute-0 systemd[1]: Reloading.
Jan 21 17:43:35 compute-0 systemd-rc-local-generator[34212]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 17:43:35 compute-0 dnf[34186]: Failed determining last makecache time.
Jan 21 17:43:35 compute-0 dnf[34186]: delorean-openstack-barbican-42b4c41831408a8e323 171 kB/s | 3.0 kB     00:00
Jan 21 17:43:35 compute-0 dnf[34186]: delorean-python-glean-10df0bd91b9bc5c9fd9cc02d7 196 kB/s | 3.0 kB     00:00
Jan 21 17:43:35 compute-0 systemd[1]: Listening on LVM2 poll daemon socket.
Jan 21 17:43:35 compute-0 dnf[34186]: delorean-openstack-cinder-1c00d6490d88e436f26ef 203 kB/s | 3.0 kB     00:00
Jan 21 17:43:35 compute-0 dnf[34186]: delorean-python-stevedore-c4acc5639fd2329372142 195 kB/s | 3.0 kB     00:00
Jan 21 17:43:35 compute-0 dnf[34186]: delorean-python-cloudkitty-tests-tempest-2c80f8 197 kB/s | 3.0 kB     00:00
Jan 21 17:43:35 compute-0 dnf[34186]: delorean-os-refresh-config-9bfc52b5049be2d8de61 203 kB/s | 3.0 kB     00:00
Jan 21 17:43:35 compute-0 dnf[34186]: delorean-openstack-nova-6f8decf0b4f1aa2e96292b6 213 kB/s | 3.0 kB     00:00
Jan 21 17:43:35 compute-0 dnf[34186]: delorean-python-designate-tests-tempest-347fdbc 198 kB/s | 3.0 kB     00:00
Jan 21 17:43:35 compute-0 dnf[34186]: delorean-openstack-glance-1fd12c29b339f30fe823e 202 kB/s | 3.0 kB     00:00
Jan 21 17:43:35 compute-0 dbus-broker-launch[755]: Noticed file-system modification, trigger reload.
Jan 21 17:43:35 compute-0 dbus-broker-launch[755]: Noticed file-system modification, trigger reload.
Jan 21 17:43:35 compute-0 dbus-broker-launch[755]: Noticed file-system modification, trigger reload.
Jan 21 17:43:35 compute-0 dnf[34186]: delorean-openstack-keystone-e4b40af0ae3698fbbbb 112 kB/s | 3.0 kB     00:00
Jan 21 17:43:35 compute-0 dnf[34186]: delorean-openstack-manila-3c01b7181572c95dac462 139 kB/s | 3.0 kB     00:00
Jan 21 17:43:35 compute-0 dnf[34186]: delorean-python-whitebox-neutron-tests-tempest- 136 kB/s | 3.0 kB     00:00
Jan 21 17:43:35 compute-0 dnf[34186]: delorean-openstack-octavia-ba397f07a7331190208c 134 kB/s | 3.0 kB     00:00
Jan 21 17:43:35 compute-0 dnf[34186]: delorean-openstack-watcher-c014f81a8647287f6dcc 137 kB/s | 3.0 kB     00:00
Jan 21 17:43:35 compute-0 dnf[34186]: delorean-ansible-config_template-5ccaa22121a7ff 147 kB/s | 3.0 kB     00:00
Jan 21 17:43:35 compute-0 dnf[34186]: delorean-puppet-ceph-7352068d7b8c84ded636ab3158 147 kB/s | 3.0 kB     00:00
Jan 21 17:43:35 compute-0 dnf[34186]: delorean-openstack-swift-dc98a8463506ac520c469a 153 kB/s | 3.0 kB     00:00
Jan 21 17:43:35 compute-0 dnf[34186]: delorean-python-tempestconf-8515371b7cceebd4282 143 kB/s | 3.0 kB     00:00
Jan 21 17:43:35 compute-0 dnf[34186]: delorean-openstack-heat-ui-013accbfd179753bc3f0 139 kB/s | 3.0 kB     00:00
Jan 21 17:43:35 compute-0 dnf[34186]: CentOS Stream 9 - BaseOS                         57 kB/s | 6.7 kB     00:00
Jan 21 17:43:36 compute-0 dnf[34186]: CentOS Stream 9 - AppStream                      67 kB/s | 6.8 kB     00:00
Jan 21 17:43:36 compute-0 dnf[34186]: CentOS Stream 9 - CRB                            65 kB/s | 6.6 kB     00:00
Jan 21 17:43:36 compute-0 dnf[34186]: CentOS Stream 9 - Extras packages                60 kB/s | 7.3 kB     00:00
Jan 21 17:43:36 compute-0 dnf[34186]: dlrn-antelope-testing                           186 kB/s | 3.0 kB     00:00
Jan 21 17:43:36 compute-0 dnf[34186]: dlrn-antelope-build-deps                        207 kB/s | 3.0 kB     00:00
Jan 21 17:43:36 compute-0 dnf[34186]: centos9-rabbitmq                                125 kB/s | 3.0 kB     00:00
Jan 21 17:43:36 compute-0 dnf[34186]: centos9-storage                                 139 kB/s | 3.0 kB     00:00
Jan 21 17:43:36 compute-0 dnf[34186]: centos9-opstools                                132 kB/s | 3.0 kB     00:00
Jan 21 17:43:36 compute-0 dnf[34186]: NFV SIG OpenvSwitch                              79 kB/s | 3.0 kB     00:00
Jan 21 17:43:36 compute-0 dnf[34186]: repo-setup-centos-appstream                      95 kB/s | 4.4 kB     00:00
Jan 21 17:43:36 compute-0 dnf[34186]: repo-setup-centos-baseos                         95 kB/s | 3.9 kB     00:00
Jan 21 17:43:36 compute-0 dnf[34186]: repo-setup-centos-highavailability              151 kB/s | 3.9 kB     00:00
Jan 21 17:43:36 compute-0 dnf[34186]: repo-setup-centos-powertools                    181 kB/s | 4.3 kB     00:00
Jan 21 17:43:37 compute-0 dnf[34186]: Extra Packages for Enterprise Linux 9 - x86_64  235 kB/s |  31 kB     00:00
Jan 21 17:43:37 compute-0 dnf[34186]: Metadata cache created.
Jan 21 17:43:37 compute-0 systemd[1]: dnf-makecache.service: Deactivated successfully.
Jan 21 17:43:37 compute-0 systemd[1]: Finished dnf makecache.
Jan 21 17:43:37 compute-0 systemd[1]: dnf-makecache.service: Consumed 1.784s CPU time.
Jan 21 17:44:39 compute-0 kernel: SELinux:  Converting 2724 SID table entries...
Jan 21 17:44:39 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Jan 21 17:44:39 compute-0 kernel: SELinux:  policy capability open_perms=1
Jan 21 17:44:39 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Jan 21 17:44:39 compute-0 kernel: SELinux:  policy capability always_check_network=0
Jan 21 17:44:39 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 21 17:44:39 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 21 17:44:39 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 21 17:44:40 compute-0 dbus-broker-launch[769]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Jan 21 17:44:40 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 21 17:44:40 compute-0 systemd[1]: Starting man-db-cache-update.service...
Jan 21 17:44:40 compute-0 systemd[1]: Reloading.
Jan 21 17:44:40 compute-0 systemd-rc-local-generator[34619]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 17:44:40 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 21 17:44:40 compute-0 sudo[33930]: pam_unix(sudo:session): session closed for user root
Jan 21 17:44:41 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 21 17:44:41 compute-0 systemd[1]: Finished man-db-cache-update.service.
Jan 21 17:44:41 compute-0 systemd[1]: man-db-cache-update.service: Consumed 1.188s CPU time.
Jan 21 17:44:41 compute-0 systemd[1]: run-rb9bf279f5d6d434fb9855ccf7443af07.service: Deactivated successfully.
Jan 21 17:44:54 compute-0 sshd-session[35405]: Invalid user node from 64.227.98.100 port 60160
Jan 21 17:44:54 compute-0 sshd-session[35405]: Connection closed by invalid user node 64.227.98.100 port 60160 [preauth]
Jan 21 17:45:03 compute-0 sudo[35532]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pqkzbfrdlnvfpgcxgcdrawatanlakaey ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017503.2057526-304-131002876185406/AnsiballZ_command.py'
Jan 21 17:45:03 compute-0 sudo[35532]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:45:03 compute-0 python3.9[35534]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 17:45:04 compute-0 sudo[35532]: pam_unix(sudo:session): session closed for user root
Jan 21 17:45:05 compute-0 sudo[35813]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ratnseswjfgrpimftemdlnveplzaeflh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017504.6147964-320-75012794792945/AnsiballZ_selinux.py'
Jan 21 17:45:05 compute-0 sudo[35813]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:45:05 compute-0 python3.9[35815]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Jan 21 17:45:05 compute-0 sudo[35813]: pam_unix(sudo:session): session closed for user root
Jan 21 17:45:06 compute-0 sudo[35965]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jdsbzwnujvoksjnwgahgvramjyloyxrs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017506.0037522-342-105086480910492/AnsiballZ_command.py'
Jan 21 17:45:06 compute-0 sudo[35965]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:45:06 compute-0 python3.9[35967]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Jan 21 17:45:07 compute-0 sudo[35965]: pam_unix(sudo:session): session closed for user root
Jan 21 17:45:09 compute-0 sudo[36118]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-raudmhkuqqkatyjkfsewtxolwnbqogrj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017509.2795913-358-257238327385500/AnsiballZ_file.py'
Jan 21 17:45:09 compute-0 sudo[36118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:45:09 compute-0 python3.9[36120]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:45:09 compute-0 sudo[36118]: pam_unix(sudo:session): session closed for user root
Jan 21 17:45:10 compute-0 sudo[36270]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nhgikqehykupwjkhcilgignmeoeryttp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017510.128456-374-165498417112167/AnsiballZ_mount.py'
Jan 21 17:45:10 compute-0 sudo[36270]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:45:11 compute-0 python3.9[36272]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Jan 21 17:45:11 compute-0 sudo[36270]: pam_unix(sudo:session): session closed for user root
Jan 21 17:45:12 compute-0 sudo[36422]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yqzvhpcocnukncuvuovvolkkgizhosmk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017512.3105168-430-107951274098819/AnsiballZ_file.py'
Jan 21 17:45:12 compute-0 sudo[36422]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:45:14 compute-0 python3.9[36424]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 17:45:14 compute-0 sudo[36422]: pam_unix(sudo:session): session closed for user root
Jan 21 17:45:14 compute-0 sudo[36574]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uuvioszxbxpdrwgeolebxcoocuwbhama ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017514.2354267-446-177429925726194/AnsiballZ_stat.py'
Jan 21 17:45:14 compute-0 sudo[36574]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:45:16 compute-0 python3.9[36576]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:45:16 compute-0 sudo[36574]: pam_unix(sudo:session): session closed for user root
Jan 21 17:45:16 compute-0 sudo[36698]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ajwxlvagymgkukbahbeidpxovfeyllvk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017514.2354267-446-177429925726194/AnsiballZ_copy.py'
Jan 21 17:45:16 compute-0 sudo[36698]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:45:16 compute-0 python3.9[36700]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769017514.2354267-446-177429925726194/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=b2ca5e5f576f827289b8dc0eb476f75fc973645a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:45:16 compute-0 sudo[36698]: pam_unix(sudo:session): session closed for user root
Jan 21 17:45:17 compute-0 sudo[36850]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-weykbucgssfpvyctkyfvfebjomngbdvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017517.4919136-494-112575895618486/AnsiballZ_stat.py'
Jan 21 17:45:17 compute-0 sudo[36850]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:45:18 compute-0 python3.9[36852]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 17:45:18 compute-0 sudo[36850]: pam_unix(sudo:session): session closed for user root
Jan 21 17:45:19 compute-0 sudo[37002]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rygqjcrkgnkfihypkauqqbjprvcbceth ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017519.0866275-510-121984816513325/AnsiballZ_command.py'
Jan 21 17:45:19 compute-0 sudo[37002]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:45:19 compute-0 python3.9[37004]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 17:45:19 compute-0 sudo[37002]: pam_unix(sudo:session): session closed for user root
Jan 21 17:45:21 compute-0 sudo[37155]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xipscoybmjylqgbiylzixqqbyeexetok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017521.4186583-526-138999331780644/AnsiballZ_file.py'
Jan 21 17:45:21 compute-0 sudo[37155]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:45:21 compute-0 python3.9[37157]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:45:21 compute-0 sudo[37155]: pam_unix(sudo:session): session closed for user root
Jan 21 17:45:22 compute-0 sudo[37307]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fucarfjalizaaeeuakcadctrapknrhwu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017522.31398-548-251068178669348/AnsiballZ_getent.py'
Jan 21 17:45:22 compute-0 sudo[37307]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:45:22 compute-0 python3.9[37309]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Jan 21 17:45:22 compute-0 sudo[37307]: pam_unix(sudo:session): session closed for user root
Jan 21 17:45:22 compute-0 rsyslogd[1002]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 21 17:45:23 compute-0 sudo[37461]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jwjeeisrtnjcbcewjbijxtivppkrfdij ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017523.1935854-564-202611397208605/AnsiballZ_group.py'
Jan 21 17:45:23 compute-0 sudo[37461]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:45:23 compute-0 python3.9[37463]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 21 17:45:23 compute-0 groupadd[37464]: group added to /etc/group: name=qemu, GID=107
Jan 21 17:45:23 compute-0 groupadd[37464]: group added to /etc/gshadow: name=qemu
Jan 21 17:45:23 compute-0 groupadd[37464]: new group: name=qemu, GID=107
Jan 21 17:45:23 compute-0 sudo[37461]: pam_unix(sudo:session): session closed for user root
Jan 21 17:45:24 compute-0 sudo[37619]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kgwvfqxgspgahaplnnlmphyqbubaoxdq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017524.260895-580-3811518335813/AnsiballZ_user.py'
Jan 21 17:45:24 compute-0 sudo[37619]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:45:25 compute-0 python3.9[37621]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 21 17:45:25 compute-0 useradd[37623]: new user: name=qemu, UID=107, GID=107, home=/home/qemu, shell=/sbin/nologin, from=/dev/pts/0
Jan 21 17:45:25 compute-0 sudo[37619]: pam_unix(sudo:session): session closed for user root
Jan 21 17:45:25 compute-0 sudo[37779]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cuapbozzkgrmoioecqwquetvystmdlaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017525.3345244-596-12168419868724/AnsiballZ_getent.py'
Jan 21 17:45:25 compute-0 sudo[37779]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:45:25 compute-0 python3.9[37781]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Jan 21 17:45:25 compute-0 sudo[37779]: pam_unix(sudo:session): session closed for user root
Jan 21 17:45:26 compute-0 sudo[37932]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vdowwuumqwkeodiyozhcoibkzjmvqxta ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017526.0226576-612-160866466200626/AnsiballZ_group.py'
Jan 21 17:45:26 compute-0 sudo[37932]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:45:26 compute-0 python3.9[37934]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 21 17:45:26 compute-0 groupadd[37935]: group added to /etc/group: name=hugetlbfs, GID=42477
Jan 21 17:45:26 compute-0 groupadd[37935]: group added to /etc/gshadow: name=hugetlbfs
Jan 21 17:45:26 compute-0 groupadd[37935]: new group: name=hugetlbfs, GID=42477
Jan 21 17:45:26 compute-0 sudo[37932]: pam_unix(sudo:session): session closed for user root
Jan 21 17:45:27 compute-0 sudo[38090]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-irfwvjbwzolwgigxsyfttqhgmkcsreao ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017526.9127553-630-85468480013054/AnsiballZ_file.py'
Jan 21 17:45:27 compute-0 sudo[38090]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:45:27 compute-0 python3.9[38092]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Jan 21 17:45:27 compute-0 sudo[38090]: pam_unix(sudo:session): session closed for user root
Jan 21 17:45:28 compute-0 sudo[38242]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ibkgtbosuzjjybfiysqbenrshitayqct ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017527.8189-652-30622389864190/AnsiballZ_dnf.py'
Jan 21 17:45:28 compute-0 sudo[38242]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:45:28 compute-0 python3.9[38244]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 21 17:45:29 compute-0 sudo[38242]: pam_unix(sudo:session): session closed for user root
Jan 21 17:45:31 compute-0 sudo[38396]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iducfdqurbrvfvrzhcpgyxtaqhuduseg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017531.5084043-668-82435870346879/AnsiballZ_file.py'
Jan 21 17:45:31 compute-0 sudo[38396]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:45:31 compute-0 python3.9[38398]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 17:45:32 compute-0 sudo[38396]: pam_unix(sudo:session): session closed for user root
Jan 21 17:45:32 compute-0 sudo[38548]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wirrbkgrpikqumkasflozhvdimikdifu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017532.2805738-684-109186961177843/AnsiballZ_stat.py'
Jan 21 17:45:32 compute-0 sudo[38548]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:45:32 compute-0 python3.9[38550]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:45:32 compute-0 sudo[38548]: pam_unix(sudo:session): session closed for user root
Jan 21 17:45:33 compute-0 sudo[38671]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fqypuyryptmwhmeyvmavczxduzgzsszs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017532.2805738-684-109186961177843/AnsiballZ_copy.py'
Jan 21 17:45:33 compute-0 sudo[38671]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:45:33 compute-0 python3.9[38673]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769017532.2805738-684-109186961177843/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 21 17:45:33 compute-0 sudo[38671]: pam_unix(sudo:session): session closed for user root
Jan 21 17:45:34 compute-0 sudo[38823]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-byimrutwflvykihdytnewiiuhrtvsxta ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017533.6404135-714-242733862868385/AnsiballZ_systemd.py'
Jan 21 17:45:34 compute-0 sudo[38823]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:45:34 compute-0 python3.9[38825]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 21 17:45:34 compute-0 systemd[1]: Starting Load Kernel Modules...
Jan 21 17:45:34 compute-0 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Jan 21 17:45:34 compute-0 kernel: Bridge firewalling registered
Jan 21 17:45:34 compute-0 systemd-modules-load[38829]: Inserted module 'br_netfilter'
Jan 21 17:45:34 compute-0 systemd[1]: Finished Load Kernel Modules.
Jan 21 17:45:34 compute-0 sudo[38823]: pam_unix(sudo:session): session closed for user root
Jan 21 17:45:35 compute-0 sudo[38983]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xuvrchwrirliqahztrcxldczjxjuwwsn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017534.819228-730-258123868867703/AnsiballZ_stat.py'
Jan 21 17:45:35 compute-0 sudo[38983]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:45:35 compute-0 python3.9[38985]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:45:35 compute-0 sudo[38983]: pam_unix(sudo:session): session closed for user root
Jan 21 17:45:35 compute-0 sudo[39106]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wzumngpjsavzmlkamgxzwycxwkqvcmri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017534.819228-730-258123868867703/AnsiballZ_copy.py'
Jan 21 17:45:35 compute-0 sudo[39106]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:45:35 compute-0 python3.9[39108]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769017534.819228-730-258123868867703/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 21 17:45:35 compute-0 sudo[39106]: pam_unix(sudo:session): session closed for user root
Jan 21 17:45:36 compute-0 sudo[39258]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-huzowmutubbkkvjigooozyrtvgplxwta ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017536.352648-766-191484250000509/AnsiballZ_dnf.py'
Jan 21 17:45:36 compute-0 sudo[39258]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:45:36 compute-0 python3.9[39260]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 21 17:45:39 compute-0 dbus-broker-launch[755]: Noticed file-system modification, trigger reload.
Jan 21 17:45:39 compute-0 dbus-broker-launch[755]: Noticed file-system modification, trigger reload.
Jan 21 17:45:40 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 21 17:45:40 compute-0 systemd[1]: Starting man-db-cache-update.service...
Jan 21 17:45:40 compute-0 systemd[1]: Reloading.
Jan 21 17:45:40 compute-0 systemd-rc-local-generator[39324]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 17:45:40 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 21 17:45:40 compute-0 sudo[39258]: pam_unix(sudo:session): session closed for user root
Jan 21 17:45:43 compute-0 python3.9[42245]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 17:45:43 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 21 17:45:43 compute-0 systemd[1]: Finished man-db-cache-update.service.
Jan 21 17:45:43 compute-0 systemd[1]: man-db-cache-update.service: Consumed 4.475s CPU time.
Jan 21 17:45:43 compute-0 systemd[1]: run-r90f1fb76979d4980a934d9ebd3b2a4d1.service: Deactivated successfully.
Jan 21 17:45:43 compute-0 python3.9[43125]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Jan 21 17:45:44 compute-0 python3.9[43276]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 17:45:45 compute-0 sudo[43426]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-csuquinnwckwvoogaguwojrlzvuajtgb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017545.0148966-844-93847766229156/AnsiballZ_command.py'
Jan 21 17:45:45 compute-0 sudo[43426]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:45:45 compute-0 python3.9[43428]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 17:45:45 compute-0 systemd[1]: Starting Dynamic System Tuning Daemon...
Jan 21 17:45:45 compute-0 systemd[1]: Starting Authorization Manager...
Jan 21 17:45:46 compute-0 systemd[1]: Started Dynamic System Tuning Daemon.
Jan 21 17:45:46 compute-0 polkitd[43645]: Started polkitd version 0.117
Jan 21 17:45:46 compute-0 polkitd[43645]: Loading rules from directory /etc/polkit-1/rules.d
Jan 21 17:45:46 compute-0 polkitd[43645]: Loading rules from directory /usr/share/polkit-1/rules.d
Jan 21 17:45:46 compute-0 polkitd[43645]: Finished loading, compiling and executing 2 rules
Jan 21 17:45:46 compute-0 systemd[1]: Started Authorization Manager.
Jan 21 17:45:46 compute-0 polkitd[43645]: Acquired the name org.freedesktop.PolicyKit1 on the system bus
Jan 21 17:45:46 compute-0 sudo[43426]: pam_unix(sudo:session): session closed for user root
Jan 21 17:45:46 compute-0 sudo[43813]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxzggznhpcnbnamnjykuvnlpndvyuzzv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017546.5743186-862-259447255052110/AnsiballZ_systemd.py'
Jan 21 17:45:46 compute-0 sudo[43813]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:45:47 compute-0 python3.9[43815]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 17:45:47 compute-0 systemd[1]: Stopping Dynamic System Tuning Daemon...
Jan 21 17:45:47 compute-0 systemd[1]: tuned.service: Deactivated successfully.
Jan 21 17:45:47 compute-0 systemd[1]: Stopped Dynamic System Tuning Daemon.
Jan 21 17:45:47 compute-0 systemd[1]: Starting Dynamic System Tuning Daemon...
Jan 21 17:45:47 compute-0 systemd[1]: Started Dynamic System Tuning Daemon.
Jan 21 17:45:47 compute-0 sudo[43813]: pam_unix(sudo:session): session closed for user root
Jan 21 17:45:48 compute-0 python3.9[43976]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Jan 21 17:45:51 compute-0 sudo[44126]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gkfgosrlkorbwzqvnqtqfocvednonupu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017550.9556844-976-204722678693892/AnsiballZ_systemd.py'
Jan 21 17:45:51 compute-0 sudo[44126]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:45:51 compute-0 python3.9[44128]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 17:45:51 compute-0 systemd[1]: Reloading.
Jan 21 17:45:51 compute-0 systemd-rc-local-generator[44153]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 17:45:51 compute-0 sudo[44126]: pam_unix(sudo:session): session closed for user root
Jan 21 17:45:52 compute-0 sudo[44314]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qxxyjoagpsidzhgkawrkwfonxoifqodz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017552.0688417-976-29200585617441/AnsiballZ_systemd.py'
Jan 21 17:45:52 compute-0 sudo[44314]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:45:52 compute-0 python3.9[44316]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 17:45:52 compute-0 systemd[1]: Reloading.
Jan 21 17:45:52 compute-0 systemd-rc-local-generator[44347]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 17:45:52 compute-0 sudo[44314]: pam_unix(sudo:session): session closed for user root
Jan 21 17:45:53 compute-0 sudo[44503]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xjclsgczahmirqrzoxjuhfmlyjwkmlxp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017553.3750691-1008-223162078421958/AnsiballZ_command.py'
Jan 21 17:45:53 compute-0 sudo[44503]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:45:53 compute-0 python3.9[44505]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 17:45:53 compute-0 sudo[44503]: pam_unix(sudo:session): session closed for user root
Jan 21 17:45:54 compute-0 sudo[44656]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kcthnfzpbudbddwhtdepyyqfumjgxwkk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017554.168598-1024-12807904051921/AnsiballZ_command.py'
Jan 21 17:45:54 compute-0 sudo[44656]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:45:54 compute-0 python3.9[44658]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 17:45:54 compute-0 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Jan 21 17:45:54 compute-0 sudo[44656]: pam_unix(sudo:session): session closed for user root
Jan 21 17:45:55 compute-0 sudo[44809]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-urogdrctgotrmbvaccvzxmjqesxdrjzn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017554.943876-1040-50934579384685/AnsiballZ_command.py'
Jan 21 17:45:55 compute-0 sudo[44809]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:45:55 compute-0 python3.9[44811]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 17:45:56 compute-0 sudo[44809]: pam_unix(sudo:session): session closed for user root
Jan 21 17:45:57 compute-0 sudo[44971]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwfvjbuipaigwmdeokdcmpxhvmdfypqq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017557.230294-1056-29159436511143/AnsiballZ_command.py'
Jan 21 17:45:57 compute-0 sudo[44971]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:45:57 compute-0 python3.9[44973]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 17:45:57 compute-0 sudo[44971]: pam_unix(sudo:session): session closed for user root
Jan 21 17:45:58 compute-0 sudo[45124]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mydhcmthvnrtcbkejqwbfvxgejihheti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017557.9180005-1072-212149221315980/AnsiballZ_systemd.py'
Jan 21 17:45:58 compute-0 sudo[45124]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:45:58 compute-0 python3.9[45126]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 21 17:45:58 compute-0 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jan 21 17:45:58 compute-0 systemd[1]: Stopped Apply Kernel Variables.
Jan 21 17:45:58 compute-0 systemd[1]: Stopping Apply Kernel Variables...
Jan 21 17:45:58 compute-0 systemd[1]: Starting Apply Kernel Variables...
Jan 21 17:45:58 compute-0 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Jan 21 17:45:58 compute-0 systemd[1]: Finished Apply Kernel Variables.
Jan 21 17:45:58 compute-0 sudo[45124]: pam_unix(sudo:session): session closed for user root
Jan 21 17:45:59 compute-0 sshd-session[31467]: Connection closed by 192.168.122.30 port 58372
Jan 21 17:45:59 compute-0 sshd-session[31464]: pam_unix(sshd:session): session closed for user zuul
Jan 21 17:45:59 compute-0 systemd[1]: session-10.scope: Deactivated successfully.
Jan 21 17:45:59 compute-0 systemd[1]: session-10.scope: Consumed 2min 13.007s CPU time.
Jan 21 17:45:59 compute-0 systemd-logind[782]: Session 10 logged out. Waiting for processes to exit.
Jan 21 17:45:59 compute-0 systemd-logind[782]: Removed session 10.
Jan 21 17:46:04 compute-0 sshd-session[45156]: Accepted publickey for zuul from 192.168.122.30 port 45732 ssh2: ECDSA SHA256:cKen23vhgWniT1Xii+Y+iYDmAKCwydOnjbSSWnKmVRE
Jan 21 17:46:04 compute-0 systemd-logind[782]: New session 11 of user zuul.
Jan 21 17:46:04 compute-0 systemd[1]: Started Session 11 of User zuul.
Jan 21 17:46:04 compute-0 sshd-session[45156]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 21 17:46:05 compute-0 python3.9[45309]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 17:46:06 compute-0 python3.9[45463]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 17:46:07 compute-0 sudo[45617]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qdylrwgbbmavxhbdnbwadsawlukjyaxl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017567.3961504-75-208912237120879/AnsiballZ_command.py'
Jan 21 17:46:07 compute-0 sudo[45617]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:46:07 compute-0 python3.9[45619]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 17:46:07 compute-0 sudo[45617]: pam_unix(sudo:session): session closed for user root
Jan 21 17:46:09 compute-0 python3.9[45770]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 17:46:10 compute-0 sudo[45924]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-asplufpvomwfolkteeqweefrhhcknxeg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017570.2316027-115-141352760564399/AnsiballZ_setup.py'
Jan 21 17:46:10 compute-0 sudo[45924]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:46:10 compute-0 python3.9[45926]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 21 17:46:11 compute-0 sudo[45924]: pam_unix(sudo:session): session closed for user root
Jan 21 17:46:11 compute-0 sudo[46008]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgmhrkapcgsxbmkbplacxumcfqmhzzvt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017570.2316027-115-141352760564399/AnsiballZ_dnf.py'
Jan 21 17:46:11 compute-0 sudo[46008]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:46:11 compute-0 python3.9[46010]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 21 17:46:12 compute-0 sudo[46008]: pam_unix(sudo:session): session closed for user root
Jan 21 17:46:14 compute-0 sudo[46161]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ftflqeduxfrjbuqiavabgsigtamotqvt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017573.8608305-139-213369011127573/AnsiballZ_setup.py'
Jan 21 17:46:14 compute-0 sudo[46161]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:46:14 compute-0 python3.9[46163]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 21 17:46:14 compute-0 sudo[46161]: pam_unix(sudo:session): session closed for user root
Jan 21 17:46:15 compute-0 sudo[46332]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xkagijljonlxoawshvrdayxbnoyqhnsv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017574.9194076-161-63792281715736/AnsiballZ_file.py'
Jan 21 17:46:15 compute-0 sudo[46332]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:46:15 compute-0 python3.9[46334]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:46:15 compute-0 sudo[46332]: pam_unix(sudo:session): session closed for user root
Jan 21 17:46:16 compute-0 sudo[46484]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cpshpfxfockvjbfmlbktqlaiwqhkulmb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017575.7933168-177-246360203569069/AnsiballZ_command.py'
Jan 21 17:46:16 compute-0 sudo[46484]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:46:16 compute-0 python3.9[46486]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 17:46:16 compute-0 podman[46487]: 2026-01-21 17:46:16.251176777 +0000 UTC m=+0.041306349 system refresh
Jan 21 17:46:16 compute-0 sudo[46484]: pam_unix(sudo:session): session closed for user root
Jan 21 17:46:16 compute-0 sudo[46646]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vhkurdexnbpwcrwzzrhwnpkqrjyypfbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017576.6247647-193-174309580736745/AnsiballZ_stat.py'
Jan 21 17:46:16 compute-0 sudo[46646]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:46:17 compute-0 python3.9[46648]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:46:17 compute-0 sudo[46646]: pam_unix(sudo:session): session closed for user root
Jan 21 17:46:17 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 17:46:17 compute-0 sudo[46769]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hntgokizmcbpnpaxfthsbssetbfpxewu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017576.6247647-193-174309580736745/AnsiballZ_copy.py'
Jan 21 17:46:17 compute-0 sudo[46769]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:46:17 compute-0 python3.9[46771]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769017576.6247647-193-174309580736745/.source.json follow=False _original_basename=podman_network_config.j2 checksum=ff957d8c863bf925f3c10df018c68bd793860aa5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:46:17 compute-0 sudo[46769]: pam_unix(sudo:session): session closed for user root
Jan 21 17:46:18 compute-0 sudo[46921]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fsksqhvtzsgjtuwllxvnigcazduseesz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017578.067271-223-248388204600724/AnsiballZ_stat.py'
Jan 21 17:46:18 compute-0 sudo[46921]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:46:18 compute-0 python3.9[46923]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:46:18 compute-0 sudo[46921]: pam_unix(sudo:session): session closed for user root
Jan 21 17:46:18 compute-0 sudo[47044]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qgffepdmxozavcbdbjtnaxmrwzylxdji ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017578.067271-223-248388204600724/AnsiballZ_copy.py'
Jan 21 17:46:18 compute-0 sudo[47044]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:46:19 compute-0 python3.9[47046]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769017578.067271-223-248388204600724/.source.conf follow=False _original_basename=registries.conf.j2 checksum=8ac2369a44bfbbc0cb814de0283ed73ed2d94205 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 21 17:46:19 compute-0 sudo[47044]: pam_unix(sudo:session): session closed for user root
Jan 21 17:46:20 compute-0 sudo[47196]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hnkwlfusjneifpbpmxvebsueinubujnx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017579.7233865-255-183382054240998/AnsiballZ_ini_file.py'
Jan 21 17:46:20 compute-0 sudo[47196]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:46:20 compute-0 python3.9[47198]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 21 17:46:20 compute-0 sudo[47196]: pam_unix(sudo:session): session closed for user root
Jan 21 17:46:20 compute-0 sudo[47348]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vpshxbeawwxcknhbcutjbskrxjokesgx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017580.4642832-255-242825198226601/AnsiballZ_ini_file.py'
Jan 21 17:46:20 compute-0 sudo[47348]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:46:20 compute-0 python3.9[47350]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 21 17:46:20 compute-0 sudo[47348]: pam_unix(sudo:session): session closed for user root
Jan 21 17:46:21 compute-0 sudo[47500]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmfgbxyppafnhydtonrhgnqykpdzrmwi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017581.0403771-255-11601471680741/AnsiballZ_ini_file.py'
Jan 21 17:46:21 compute-0 sudo[47500]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:46:21 compute-0 python3.9[47502]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 21 17:46:21 compute-0 sudo[47500]: pam_unix(sudo:session): session closed for user root
Jan 21 17:46:22 compute-0 sudo[47652]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yfzslbzmkahviesfpmxosgetvnjvibhp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017581.8335161-255-278404590349587/AnsiballZ_ini_file.py'
Jan 21 17:46:22 compute-0 sudo[47652]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:46:22 compute-0 python3.9[47654]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 21 17:46:22 compute-0 sudo[47652]: pam_unix(sudo:session): session closed for user root
Jan 21 17:46:23 compute-0 python3.9[47804]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 17:46:24 compute-0 sudo[47956]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-clkbwfsaorcsikbxdhenovcycmvvmphi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017583.68368-335-167320497831887/AnsiballZ_dnf.py'
Jan 21 17:46:24 compute-0 sudo[47956]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:46:24 compute-0 python3.9[47958]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 21 17:46:25 compute-0 sudo[47956]: pam_unix(sudo:session): session closed for user root
Jan 21 17:46:26 compute-0 sudo[48109]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-otojbyopqurzzgmycmxgfdwzorlvsbvl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017586.2446022-351-267447457732119/AnsiballZ_dnf.py'
Jan 21 17:46:26 compute-0 sudo[48109]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:46:26 compute-0 python3.9[48111]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 21 17:46:28 compute-0 sudo[48109]: pam_unix(sudo:session): session closed for user root
Jan 21 17:46:29 compute-0 sudo[48269]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pueawetjhcbkrollspwnrjhjjhhwzsdn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017589.641825-371-198278798374097/AnsiballZ_dnf.py'
Jan 21 17:46:29 compute-0 sudo[48269]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:46:30 compute-0 python3.9[48271]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 21 17:46:31 compute-0 sudo[48269]: pam_unix(sudo:session): session closed for user root
Jan 21 17:46:32 compute-0 sudo[48422]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pozwcaqbsvpmibcbgwkucynxinbldzow ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017592.0567026-389-135022289784875/AnsiballZ_dnf.py'
Jan 21 17:46:32 compute-0 sudo[48422]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:46:32 compute-0 python3.9[48424]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 21 17:46:33 compute-0 sudo[48422]: pam_unix(sudo:session): session closed for user root
Jan 21 17:46:34 compute-0 sudo[48575]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qqvxgmrftfbqzruvvnttxznkixojtvio ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017594.3507452-411-78150588452583/AnsiballZ_dnf.py'
Jan 21 17:46:34 compute-0 sudo[48575]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:46:34 compute-0 python3.9[48577]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['NetworkManager-ovs'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 21 17:46:36 compute-0 sudo[48575]: pam_unix(sudo:session): session closed for user root
Jan 21 17:46:37 compute-0 sudo[48731]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fgdmyltxceivbkrxzwyezwiafexrhyid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017596.9672246-427-276383746958563/AnsiballZ_dnf.py'
Jan 21 17:46:37 compute-0 sudo[48731]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:46:37 compute-0 python3.9[48733]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 21 17:46:40 compute-0 sudo[48731]: pam_unix(sudo:session): session closed for user root
Jan 21 17:46:42 compute-0 sudo[48901]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eorkjeriqlpmlxcutxfqgahrhlpxhcfe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017602.2049603-445-223574053628986/AnsiballZ_dnf.py'
Jan 21 17:46:42 compute-0 sudo[48901]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:46:42 compute-0 python3.9[48903]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 21 17:46:43 compute-0 sudo[48901]: pam_unix(sudo:session): session closed for user root
Jan 21 17:46:44 compute-0 sudo[49054]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nzuliuwhaueohfanfaushiessqrapzaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017604.3398142-463-113072235392312/AnsiballZ_dnf.py'
Jan 21 17:46:44 compute-0 sudo[49054]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:46:44 compute-0 python3.9[49056]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 21 17:46:56 compute-0 sudo[49054]: pam_unix(sudo:session): session closed for user root
Jan 21 17:47:03 compute-0 sudo[49389]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hgtvggpeceonfsogkjagijotqtrpagph ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017622.940628-481-123010318498813/AnsiballZ_dnf.py'
Jan 21 17:47:03 compute-0 sudo[49389]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:47:03 compute-0 python3.9[49391]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['iscsi-initiator-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 21 17:47:04 compute-0 sudo[49389]: pam_unix(sudo:session): session closed for user root
Jan 21 17:47:05 compute-0 sudo[49545]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sacygopubxbummgjidtizaugwkldoajq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017625.4111032-501-27799534586489/AnsiballZ_dnf.py'
Jan 21 17:47:05 compute-0 sudo[49545]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:47:05 compute-0 python3.9[49547]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['device-mapper-multipath'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 21 17:47:07 compute-0 sudo[49545]: pam_unix(sudo:session): session closed for user root
Jan 21 17:47:07 compute-0 sshd-session[49553]: Invalid user polkadot from 64.227.98.100 port 48280
Jan 21 17:47:07 compute-0 sshd-session[49553]: Connection closed by invalid user polkadot 64.227.98.100 port 48280 [preauth]
Jan 21 17:47:08 compute-0 sudo[49704]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-acmddwfcnjphplfvdgbtnojoipnsxwsx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017628.4459696-523-79807127089613/AnsiballZ_file.py'
Jan 21 17:47:08 compute-0 sudo[49704]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:47:08 compute-0 python3.9[49706]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:47:08 compute-0 sudo[49704]: pam_unix(sudo:session): session closed for user root
Jan 21 17:47:09 compute-0 sudo[49879]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-igpabvqamhgwaifqvpmebfnvuzrwogul ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017629.126817-539-123936191456144/AnsiballZ_stat.py'
Jan 21 17:47:09 compute-0 sudo[49879]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:47:09 compute-0 python3.9[49881]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:47:09 compute-0 sudo[49879]: pam_unix(sudo:session): session closed for user root
Jan 21 17:47:10 compute-0 sudo[50002]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djmitenoqwtzzeoxcbkoxtgxcasrrryb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017629.126817-539-123936191456144/AnsiballZ_copy.py'
Jan 21 17:47:10 compute-0 sudo[50002]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:47:10 compute-0 python3.9[50004]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1769017629.126817-539-123936191456144/.source.json _original_basename=.t3n4pseu follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:47:10 compute-0 sudo[50002]: pam_unix(sudo:session): session closed for user root
Jan 21 17:47:11 compute-0 sudo[50154]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vyqdkrtjidolppxneekodjbrpjanfuqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017630.6603754-575-263893836893933/AnsiballZ_podman_image.py'
Jan 21 17:47:11 compute-0 sudo[50154]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:47:11 compute-0 python3.9[50156]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Jan 21 17:47:11 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 17:47:14 compute-0 systemd[1]: var-lib-containers-storage-overlay-compat2091048440-lower\x2dmapped.mount: Deactivated successfully.
Jan 21 17:47:17 compute-0 podman[50168]: 2026-01-21 17:47:17.235770043 +0000 UTC m=+5.669694652 image pull a17927617ef5a603f0594ee0d6df65aabdc9e0303ccc5a52c36f193de33ee0fe quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Jan 21 17:47:17 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 17:47:17 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 17:47:17 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 17:47:17 compute-0 sudo[50154]: pam_unix(sudo:session): session closed for user root
Jan 21 17:47:23 compute-0 sudo[50462]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ulkvtmoxccncyrrdofaaqbpqvckrfssp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017642.9371583-597-89512669951559/AnsiballZ_podman_image.py'
Jan 21 17:47:23 compute-0 sudo[50462]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:47:23 compute-0 python3.9[50464]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Jan 21 17:47:23 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 17:47:34 compute-0 podman[50477]: 2026-01-21 17:47:34.428887878 +0000 UTC m=+10.951085143 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 17:47:34 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 17:47:34 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 17:47:34 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 17:47:34 compute-0 sudo[50462]: pam_unix(sudo:session): session closed for user root
Jan 21 17:47:38 compute-0 sudo[50808]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-knsnnnxylegmratcxsnewrtfnwzqvlpk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017658.4075975-617-80375437696981/AnsiballZ_podman_image.py'
Jan 21 17:47:38 compute-0 sudo[50808]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:47:38 compute-0 python3.9[50810]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Jan 21 17:47:38 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 17:47:51 compute-0 podman[50823]: 2026-01-21 17:47:51.203019352 +0000 UTC m=+12.218809589 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Jan 21 17:47:51 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 17:47:51 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 17:47:51 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 17:47:51 compute-0 sudo[50808]: pam_unix(sudo:session): session closed for user root
Jan 21 17:47:53 compute-0 sudo[51076]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ezgjweanzxwqkqmpxhxqndoxekofydlv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017673.056654-639-267370755470254/AnsiballZ_podman_image.py'
Jan 21 17:47:53 compute-0 sudo[51076]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:47:53 compute-0 python3.9[51078]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Jan 21 17:47:53 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 17:47:58 compute-0 podman[51089]: 2026-01-21 17:47:58.238575723 +0000 UTC m=+4.625442773 image pull 806262ad9f61127734555408f71447afe6ceede79cc666e6f523dacd5edec739 quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Jan 21 17:47:58 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 17:47:58 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 17:47:58 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 17:47:58 compute-0 sudo[51076]: pam_unix(sudo:session): session closed for user root
Jan 21 17:47:58 compute-0 sudo[51340]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pfgeeekvixdeaiigptdpervvivaqmpxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017678.5711186-639-28491204714195/AnsiballZ_podman_image.py'
Jan 21 17:47:58 compute-0 sudo[51340]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:47:59 compute-0 python3.9[51342]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter:v1.5.0 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Jan 21 17:48:01 compute-0 podman[51355]: 2026-01-21 17:48:01.048847798 +0000 UTC m=+2.006539614 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Jan 21 17:48:01 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 17:48:01 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 17:48:01 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 17:48:01 compute-0 sudo[51340]: pam_unix(sudo:session): session closed for user root
Jan 21 17:48:03 compute-0 sshd-session[45159]: Connection closed by 192.168.122.30 port 45732
Jan 21 17:48:03 compute-0 sshd-session[45156]: pam_unix(sshd:session): session closed for user zuul
Jan 21 17:48:03 compute-0 systemd-logind[782]: Session 11 logged out. Waiting for processes to exit.
Jan 21 17:48:03 compute-0 systemd[1]: session-11.scope: Deactivated successfully.
Jan 21 17:48:03 compute-0 systemd[1]: session-11.scope: Consumed 1min 43.781s CPU time.
Jan 21 17:48:03 compute-0 systemd-logind[782]: Removed session 11.
Jan 21 17:48:08 compute-0 sshd-session[51502]: Accepted publickey for zuul from 192.168.122.30 port 38612 ssh2: ECDSA SHA256:cKen23vhgWniT1Xii+Y+iYDmAKCwydOnjbSSWnKmVRE
Jan 21 17:48:08 compute-0 systemd-logind[782]: New session 12 of user zuul.
Jan 21 17:48:08 compute-0 systemd[1]: Started Session 12 of User zuul.
Jan 21 17:48:08 compute-0 sshd-session[51502]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 21 17:48:09 compute-0 python3.9[51655]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 17:48:11 compute-0 sudo[51809]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zpcxqlafdspknxwzwblenskvjfmlyvvq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017690.586726-47-71059044645876/AnsiballZ_getent.py'
Jan 21 17:48:11 compute-0 sudo[51809]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:48:11 compute-0 python3.9[51811]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Jan 21 17:48:11 compute-0 sudo[51809]: pam_unix(sudo:session): session closed for user root
Jan 21 17:48:11 compute-0 sudo[51962]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nnlqdvznkqjowwyuvjgucctpeecorhjo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017691.4782288-63-37678547805330/AnsiballZ_group.py'
Jan 21 17:48:11 compute-0 sudo[51962]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:48:12 compute-0 python3.9[51964]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 21 17:48:12 compute-0 groupadd[51965]: group added to /etc/group: name=openvswitch, GID=42476
Jan 21 17:48:12 compute-0 groupadd[51965]: group added to /etc/gshadow: name=openvswitch
Jan 21 17:48:12 compute-0 groupadd[51965]: new group: name=openvswitch, GID=42476
Jan 21 17:48:12 compute-0 sudo[51962]: pam_unix(sudo:session): session closed for user root
Jan 21 17:48:12 compute-0 sudo[52120]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-osupakfkricqbdwjhsqxxdwedpsdyjcm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017692.45081-79-212068696241829/AnsiballZ_user.py'
Jan 21 17:48:12 compute-0 sudo[52120]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:48:13 compute-0 python3.9[52122]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 21 17:48:13 compute-0 useradd[52124]: new user: name=openvswitch, UID=42476, GID=42476, home=/home/openvswitch, shell=/sbin/nologin, from=/dev/pts/0
Jan 21 17:48:13 compute-0 useradd[52124]: add 'openvswitch' to group 'hugetlbfs'
Jan 21 17:48:13 compute-0 useradd[52124]: add 'openvswitch' to shadow group 'hugetlbfs'
Jan 21 17:48:13 compute-0 sudo[52120]: pam_unix(sudo:session): session closed for user root
Jan 21 17:48:13 compute-0 sudo[52280]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bbigwveqbvhblbulbamhfsdovxeuixqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017693.593909-99-81356060736024/AnsiballZ_setup.py'
Jan 21 17:48:13 compute-0 sudo[52280]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:48:14 compute-0 python3.9[52282]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 21 17:48:14 compute-0 sudo[52280]: pam_unix(sudo:session): session closed for user root
Jan 21 17:48:14 compute-0 sudo[52364]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hyjwbtagqxxmddhvyioszktqbyyvgjym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017693.593909-99-81356060736024/AnsiballZ_dnf.py'
Jan 21 17:48:14 compute-0 sudo[52364]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:48:15 compute-0 python3.9[52366]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 21 17:48:16 compute-0 sudo[52364]: pam_unix(sudo:session): session closed for user root
Jan 21 17:48:17 compute-0 sudo[52526]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gyzuvzwjzphkqgqmdgctcqsjanosfvej ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017696.8585384-127-187525356535334/AnsiballZ_dnf.py'
Jan 21 17:48:17 compute-0 sudo[52526]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:48:17 compute-0 python3.9[52528]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 21 17:48:30 compute-0 kernel: SELinux:  Converting 2737 SID table entries...
Jan 21 17:48:30 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Jan 21 17:48:30 compute-0 kernel: SELinux:  policy capability open_perms=1
Jan 21 17:48:30 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Jan 21 17:48:30 compute-0 kernel: SELinux:  policy capability always_check_network=0
Jan 21 17:48:30 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 21 17:48:30 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 21 17:48:30 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 21 17:48:30 compute-0 groupadd[52551]: group added to /etc/group: name=unbound, GID=994
Jan 21 17:48:30 compute-0 groupadd[52551]: group added to /etc/gshadow: name=unbound
Jan 21 17:48:30 compute-0 groupadd[52551]: new group: name=unbound, GID=994
Jan 21 17:48:30 compute-0 useradd[52558]: new user: name=unbound, UID=993, GID=994, home=/var/lib/unbound, shell=/sbin/nologin, from=none
Jan 21 17:48:30 compute-0 dbus-broker-launch[769]: avc:  op=load_policy lsm=selinux seqno=7 res=1
Jan 21 17:48:30 compute-0 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Jan 21 17:48:31 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 21 17:48:31 compute-0 systemd[1]: Starting man-db-cache-update.service...
Jan 21 17:48:31 compute-0 systemd[1]: Reloading.
Jan 21 17:48:31 compute-0 systemd-sysv-generator[53061]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 17:48:31 compute-0 systemd-rc-local-generator[53057]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 17:48:31 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 21 17:48:32 compute-0 sudo[52526]: pam_unix(sudo:session): session closed for user root
Jan 21 17:48:32 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 21 17:48:32 compute-0 systemd[1]: Finished man-db-cache-update.service.
Jan 21 17:48:32 compute-0 systemd[1]: run-r6dd17089fb764f90b9bfd7b185086f77.service: Deactivated successfully.
Jan 21 17:48:34 compute-0 sudo[53624]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-azdtjhmbigluurqqgqlxvwjwuxlildne ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017714.3942537-143-204473640388657/AnsiballZ_systemd.py'
Jan 21 17:48:34 compute-0 sudo[53624]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:48:35 compute-0 python3.9[53626]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 21 17:48:35 compute-0 systemd[1]: Reloading.
Jan 21 17:48:35 compute-0 systemd-rc-local-generator[53657]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 17:48:35 compute-0 systemd-sysv-generator[53661]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 17:48:35 compute-0 systemd[1]: Starting Open vSwitch Database Unit...
Jan 21 17:48:35 compute-0 chown[53668]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Jan 21 17:48:35 compute-0 ovs-ctl[53673]: /etc/openvswitch/conf.db does not exist ... (warning).
Jan 21 17:48:35 compute-0 ovs-ctl[53673]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Jan 21 17:48:35 compute-0 ovs-ctl[53673]: Starting ovsdb-server [  OK  ]
Jan 21 17:48:35 compute-0 ovs-vsctl[53722]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Jan 21 17:48:35 compute-0 ovs-vsctl[53742]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"a4db021d-a451-4e5f-8011-49af760bda68\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Jan 21 17:48:35 compute-0 ovs-ctl[53673]: Configuring Open vSwitch system IDs [  OK  ]
Jan 21 17:48:35 compute-0 ovs-vsctl[53747]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Jan 21 17:48:35 compute-0 ovs-ctl[53673]: Enabling remote OVSDB managers [  OK  ]
Jan 21 17:48:35 compute-0 systemd[1]: Started Open vSwitch Database Unit.
Jan 21 17:48:35 compute-0 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Jan 21 17:48:35 compute-0 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Jan 21 17:48:35 compute-0 systemd[1]: Starting Open vSwitch Forwarding Unit...
Jan 21 17:48:36 compute-0 kernel: openvswitch: Open vSwitch switching datapath
Jan 21 17:48:36 compute-0 ovs-ctl[53792]: Inserting openvswitch module [  OK  ]
Jan 21 17:48:36 compute-0 ovs-ctl[53761]: Starting ovs-vswitchd [  OK  ]
Jan 21 17:48:36 compute-0 ovs-vsctl[53809]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Jan 21 17:48:36 compute-0 ovs-ctl[53761]: Enabling remote OVSDB managers [  OK  ]
Jan 21 17:48:36 compute-0 systemd[1]: Started Open vSwitch Forwarding Unit.
Jan 21 17:48:36 compute-0 systemd[1]: Starting Open vSwitch...
Jan 21 17:48:36 compute-0 systemd[1]: Finished Open vSwitch.
Jan 21 17:48:36 compute-0 sudo[53624]: pam_unix(sudo:session): session closed for user root
Jan 21 17:48:37 compute-0 python3.9[53961]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 17:48:37 compute-0 sudo[54111]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-drwdomylqtreyxyloojjcoltwkfxfnrw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017717.4096699-179-134621990473600/AnsiballZ_sefcontext.py'
Jan 21 17:48:37 compute-0 sudo[54111]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:48:38 compute-0 python3.9[54113]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Jan 21 17:48:39 compute-0 kernel: SELinux:  Converting 2751 SID table entries...
Jan 21 17:48:39 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Jan 21 17:48:39 compute-0 kernel: SELinux:  policy capability open_perms=1
Jan 21 17:48:39 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Jan 21 17:48:39 compute-0 kernel: SELinux:  policy capability always_check_network=0
Jan 21 17:48:39 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 21 17:48:39 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 21 17:48:39 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 21 17:48:39 compute-0 sudo[54111]: pam_unix(sudo:session): session closed for user root
Jan 21 17:48:40 compute-0 python3.9[54268]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 17:48:41 compute-0 sudo[54424]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-irchhaamtbquaclzxbymweipvupaiutp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017721.037522-215-192610433043548/AnsiballZ_dnf.py'
Jan 21 17:48:41 compute-0 dbus-broker-launch[769]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Jan 21 17:48:41 compute-0 sudo[54424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:48:41 compute-0 python3.9[54426]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 21 17:48:42 compute-0 sudo[54424]: pam_unix(sudo:session): session closed for user root
Jan 21 17:48:43 compute-0 sudo[54577]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qhrclkcozbcccyivuknuhrhxqybywhpe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017723.1167972-231-45566398561535/AnsiballZ_command.py'
Jan 21 17:48:43 compute-0 sudo[54577]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:48:43 compute-0 python3.9[54579]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 17:48:44 compute-0 sudo[54577]: pam_unix(sudo:session): session closed for user root
Jan 21 17:48:45 compute-0 sudo[54864]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ukpvqxxlufglhjavlqcjjmfbtnlmjcjr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017724.6839237-247-231828018740579/AnsiballZ_file.py'
Jan 21 17:48:45 compute-0 sudo[54864]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:48:45 compute-0 python3.9[54866]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None attributes=None
Jan 21 17:48:45 compute-0 sudo[54864]: pam_unix(sudo:session): session closed for user root
Jan 21 17:48:46 compute-0 python3.9[55016]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 17:48:46 compute-0 sudo[55168]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nbeusuocpvunkretiitrzkqcskrxkegy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017726.4497123-279-247753262766798/AnsiballZ_dnf.py'
Jan 21 17:48:46 compute-0 sudo[55168]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:48:46 compute-0 python3.9[55170]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 21 17:48:48 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 21 17:48:48 compute-0 systemd[1]: Starting man-db-cache-update.service...
Jan 21 17:48:48 compute-0 systemd[1]: Reloading.
Jan 21 17:48:48 compute-0 systemd-sysv-generator[55213]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 17:48:48 compute-0 systemd-rc-local-generator[55210]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 17:48:49 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 21 17:48:49 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 21 17:48:49 compute-0 systemd[1]: Finished man-db-cache-update.service.
Jan 21 17:48:49 compute-0 systemd[1]: run-r9a69d110809a4ce8a5f39e1210b24e7b.service: Deactivated successfully.
Jan 21 17:48:49 compute-0 sudo[55168]: pam_unix(sudo:session): session closed for user root
Jan 21 17:48:50 compute-0 sudo[55486]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ydlhuotkesbehskrasnrsckavdtiyqvk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017729.7785487-295-119858844214012/AnsiballZ_systemd.py'
Jan 21 17:48:50 compute-0 sudo[55486]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:48:50 compute-0 python3.9[55488]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 21 17:48:50 compute-0 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Jan 21 17:48:50 compute-0 systemd[1]: Stopped Network Manager Wait Online.
Jan 21 17:48:50 compute-0 systemd[1]: Stopping Network Manager Wait Online...
Jan 21 17:48:50 compute-0 systemd[1]: Stopping Network Manager...
Jan 21 17:48:50 compute-0 NetworkManager[7195]: <info>  [1769017730.3921] caught SIGTERM, shutting down normally.
Jan 21 17:48:50 compute-0 NetworkManager[7195]: <info>  [1769017730.3933] dhcp4 (eth0): canceled DHCP transaction
Jan 21 17:48:50 compute-0 NetworkManager[7195]: <info>  [1769017730.3934] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 21 17:48:50 compute-0 NetworkManager[7195]: <info>  [1769017730.3934] dhcp4 (eth0): state changed no lease
Jan 21 17:48:50 compute-0 NetworkManager[7195]: <info>  [1769017730.3935] manager: NetworkManager state is now CONNECTED_SITE
Jan 21 17:48:50 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 21 17:48:50 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 21 17:48:50 compute-0 NetworkManager[7195]: <info>  [1769017730.5730] exiting (success)
Jan 21 17:48:50 compute-0 systemd[1]: NetworkManager.service: Deactivated successfully.
Jan 21 17:48:50 compute-0 systemd[1]: Stopped Network Manager.
Jan 21 17:48:50 compute-0 systemd[1]: NetworkManager.service: Consumed 10.141s CPU time, 4.1M memory peak, read 0B from disk, written 31.0K to disk.
Jan 21 17:48:50 compute-0 systemd[1]: Starting Network Manager...
Jan 21 17:48:50 compute-0 NetworkManager[55506]: <info>  [1769017730.6688] NetworkManager (version 1.54.3-2.el9) is starting... (after a restart, boot:23f14126-aa07-49ac-87bf-97b3c4c8c82d)
Jan 21 17:48:50 compute-0 NetworkManager[55506]: <info>  [1769017730.6689] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 21 17:48:50 compute-0 NetworkManager[55506]: <info>  [1769017730.6753] manager[0x565230548000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 21 17:48:50 compute-0 systemd[1]: Starting Hostname Service...
Jan 21 17:48:50 compute-0 systemd[1]: Started Hostname Service.
Jan 21 17:48:50 compute-0 NetworkManager[55506]: <info>  [1769017730.7813] hostname: hostname: using hostnamed
Jan 21 17:48:50 compute-0 NetworkManager[55506]: <info>  [1769017730.7815] hostname: static hostname changed from (none) to "compute-0"
Jan 21 17:48:50 compute-0 NetworkManager[55506]: <info>  [1769017730.7820] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 21 17:48:50 compute-0 NetworkManager[55506]: <info>  [1769017730.7825] manager[0x565230548000]: rfkill: Wi-Fi hardware radio set enabled
Jan 21 17:48:50 compute-0 NetworkManager[55506]: <info>  [1769017730.7826] manager[0x565230548000]: rfkill: WWAN hardware radio set enabled
Jan 21 17:48:50 compute-0 NetworkManager[55506]: <info>  [1769017730.7843] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-ovs.so)
Jan 21 17:48:50 compute-0 NetworkManager[55506]: <info>  [1769017730.7851] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 21 17:48:50 compute-0 NetworkManager[55506]: <info>  [1769017730.7852] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 21 17:48:50 compute-0 NetworkManager[55506]: <info>  [1769017730.7852] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 21 17:48:50 compute-0 NetworkManager[55506]: <info>  [1769017730.7852] manager: Networking is enabled by state file
Jan 21 17:48:50 compute-0 NetworkManager[55506]: <info>  [1769017730.7856] settings: Loaded settings plugin: keyfile (internal)
Jan 21 17:48:50 compute-0 NetworkManager[55506]: <info>  [1769017730.7859] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 21 17:48:50 compute-0 NetworkManager[55506]: <info>  [1769017730.7878] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 21 17:48:50 compute-0 NetworkManager[55506]: <info>  [1769017730.7884] dhcp: init: Using DHCP client 'internal'
Jan 21 17:48:50 compute-0 NetworkManager[55506]: <info>  [1769017730.7887] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 21 17:48:50 compute-0 NetworkManager[55506]: <info>  [1769017730.7890] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 17:48:50 compute-0 NetworkManager[55506]: <info>  [1769017730.7894] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 21 17:48:50 compute-0 NetworkManager[55506]: <info>  [1769017730.7900] device (lo): Activation: starting connection 'lo' (5c2c2fbe-fb9d-4e67-b102-f58e14318d32)
Jan 21 17:48:50 compute-0 NetworkManager[55506]: <info>  [1769017730.7905] device (eth0): carrier: link connected
Jan 21 17:48:50 compute-0 NetworkManager[55506]: <info>  [1769017730.7908] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 21 17:48:50 compute-0 NetworkManager[55506]: <info>  [1769017730.7911] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Jan 21 17:48:50 compute-0 NetworkManager[55506]: <info>  [1769017730.7912] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 21 17:48:50 compute-0 NetworkManager[55506]: <info>  [1769017730.7916] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 21 17:48:50 compute-0 NetworkManager[55506]: <info>  [1769017730.7920] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 21 17:48:50 compute-0 NetworkManager[55506]: <info>  [1769017730.7924] device (eth1): carrier: link connected
Jan 21 17:48:50 compute-0 NetworkManager[55506]: <info>  [1769017730.7927] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 21 17:48:50 compute-0 NetworkManager[55506]: <info>  [1769017730.7930] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (811a0e92-136b-562e-ae39-d602e48987d5) (indicated)
Jan 21 17:48:50 compute-0 NetworkManager[55506]: <info>  [1769017730.7930] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 21 17:48:50 compute-0 NetworkManager[55506]: <info>  [1769017730.7934] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 21 17:48:50 compute-0 NetworkManager[55506]: <info>  [1769017730.7938] device (eth1): Activation: starting connection 'ci-private-network' (811a0e92-136b-562e-ae39-d602e48987d5)
Jan 21 17:48:50 compute-0 NetworkManager[55506]: <info>  [1769017730.7943] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 21 17:48:50 compute-0 systemd[1]: Started Network Manager.
Jan 21 17:48:50 compute-0 NetworkManager[55506]: <info>  [1769017730.7948] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 21 17:48:50 compute-0 NetworkManager[55506]: <info>  [1769017730.7950] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 21 17:48:50 compute-0 NetworkManager[55506]: <info>  [1769017730.7956] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 21 17:48:50 compute-0 NetworkManager[55506]: <info>  [1769017730.7958] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 21 17:48:50 compute-0 NetworkManager[55506]: <info>  [1769017730.7959] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 21 17:48:50 compute-0 NetworkManager[55506]: <info>  [1769017730.7961] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 21 17:48:50 compute-0 NetworkManager[55506]: <info>  [1769017730.7963] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 21 17:48:50 compute-0 NetworkManager[55506]: <info>  [1769017730.7966] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 21 17:48:50 compute-0 NetworkManager[55506]: <info>  [1769017730.7999] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 21 17:48:50 compute-0 NetworkManager[55506]: <info>  [1769017730.8003] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 21 17:48:50 compute-0 NetworkManager[55506]: <info>  [1769017730.8018] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 21 17:48:50 compute-0 NetworkManager[55506]: <info>  [1769017730.8034] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 21 17:48:50 compute-0 NetworkManager[55506]: <info>  [1769017730.8042] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 21 17:48:50 compute-0 NetworkManager[55506]: <info>  [1769017730.8047] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 21 17:48:50 compute-0 NetworkManager[55506]: <info>  [1769017730.8052] device (lo): Activation: successful, device activated.
Jan 21 17:48:50 compute-0 NetworkManager[55506]: <info>  [1769017730.8059] dhcp4 (eth0): state changed new lease, address=38.102.83.50
Jan 21 17:48:50 compute-0 NetworkManager[55506]: <info>  [1769017730.8066] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 21 17:48:50 compute-0 systemd[1]: Starting Network Manager Wait Online...
Jan 21 17:48:50 compute-0 NetworkManager[55506]: <info>  [1769017730.8145] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 21 17:48:50 compute-0 NetworkManager[55506]: <info>  [1769017730.8151] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 21 17:48:50 compute-0 NetworkManager[55506]: <info>  [1769017730.8153] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 21 17:48:50 compute-0 NetworkManager[55506]: <info>  [1769017730.8157] manager: NetworkManager state is now CONNECTED_LOCAL
Jan 21 17:48:50 compute-0 NetworkManager[55506]: <info>  [1769017730.8160] device (eth1): Activation: successful, device activated.
Jan 21 17:48:50 compute-0 NetworkManager[55506]: <info>  [1769017730.8180] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 21 17:48:50 compute-0 NetworkManager[55506]: <info>  [1769017730.8181] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 21 17:48:50 compute-0 NetworkManager[55506]: <info>  [1769017730.8184] manager: NetworkManager state is now CONNECTED_SITE
Jan 21 17:48:50 compute-0 NetworkManager[55506]: <info>  [1769017730.8187] device (eth0): Activation: successful, device activated.
Jan 21 17:48:50 compute-0 NetworkManager[55506]: <info>  [1769017730.8192] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 21 17:48:50 compute-0 NetworkManager[55506]: <info>  [1769017730.8194] manager: startup complete
Jan 21 17:48:50 compute-0 systemd[1]: Finished Network Manager Wait Online.
Jan 21 17:48:50 compute-0 sudo[55486]: pam_unix(sudo:session): session closed for user root
Jan 21 17:48:51 compute-0 sudo[55712]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwtqnwnjqaefdnmjbknhhjkuzvtifqzf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017730.991293-311-173646066637316/AnsiballZ_dnf.py'
Jan 21 17:48:51 compute-0 sudo[55712]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:48:51 compute-0 python3.9[55714]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 21 17:48:56 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 21 17:48:56 compute-0 systemd[1]: Starting man-db-cache-update.service...
Jan 21 17:48:56 compute-0 systemd[1]: Reloading.
Jan 21 17:48:56 compute-0 systemd-rc-local-generator[55761]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 17:48:56 compute-0 systemd-sysv-generator[55768]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 17:48:56 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 21 17:48:57 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 21 17:48:57 compute-0 systemd[1]: Finished man-db-cache-update.service.
Jan 21 17:48:57 compute-0 systemd[1]: run-r81a78a0b1f0f4d1cb82c197d12491c36.service: Deactivated successfully.
Jan 21 17:48:57 compute-0 sudo[55712]: pam_unix(sudo:session): session closed for user root
Jan 21 17:48:58 compute-0 sudo[56172]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kwfcnlimcrdpmxghamornlelwxvbfxdc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017738.5305538-335-212010460823595/AnsiballZ_stat.py'
Jan 21 17:48:58 compute-0 sudo[56172]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:48:59 compute-0 python3.9[56174]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 17:48:59 compute-0 sudo[56172]: pam_unix(sudo:session): session closed for user root
Jan 21 17:48:59 compute-0 sudo[56324]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ltqrggslcyrougvdmnhuscjxtiauclxg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017739.2846882-353-221187041250029/AnsiballZ_ini_file.py'
Jan 21 17:48:59 compute-0 sudo[56324]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:48:59 compute-0 python3.9[56326]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:49:00 compute-0 sudo[56324]: pam_unix(sudo:session): session closed for user root
Jan 21 17:49:00 compute-0 sudo[56478]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xdskiurwfrlqfbfjtqjqbvprrfezstkq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017740.4512014-373-137713549109411/AnsiballZ_ini_file.py'
Jan 21 17:49:00 compute-0 sudo[56478]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:49:00 compute-0 python3.9[56480]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:49:00 compute-0 sudo[56478]: pam_unix(sudo:session): session closed for user root
Jan 21 17:49:00 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 21 17:49:01 compute-0 sudo[56630]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iedaltqoowfccujqaefnnmpaveftdxhs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017741.0438952-373-164937755558454/AnsiballZ_ini_file.py'
Jan 21 17:49:01 compute-0 sudo[56630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:49:01 compute-0 python3.9[56632]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:49:01 compute-0 sudo[56630]: pam_unix(sudo:session): session closed for user root
Jan 21 17:49:02 compute-0 sudo[56782]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eokxpmzjrvbaeofocfvhjjwfgtazcygq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017741.864893-403-110907564326168/AnsiballZ_ini_file.py'
Jan 21 17:49:02 compute-0 sudo[56782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:49:02 compute-0 python3.9[56784]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:49:02 compute-0 sudo[56782]: pam_unix(sudo:session): session closed for user root
Jan 21 17:49:02 compute-0 sudo[56934]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nkwakbzrlzojeawgqvxcgkgmfjlewyzo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017742.5631082-403-47114982209076/AnsiballZ_ini_file.py'
Jan 21 17:49:02 compute-0 sudo[56934]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:49:02 compute-0 python3.9[56936]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:49:02 compute-0 sudo[56934]: pam_unix(sudo:session): session closed for user root
Jan 21 17:49:03 compute-0 sudo[57086]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-srlqbkwykecgdbfvptjhcefdybibtssy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017743.2885807-433-172052252729521/AnsiballZ_stat.py'
Jan 21 17:49:03 compute-0 sudo[57086]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:49:03 compute-0 python3.9[57088]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:49:03 compute-0 sudo[57086]: pam_unix(sudo:session): session closed for user root
Jan 21 17:49:04 compute-0 sudo[57209]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rzlkpaeuqrulcjmjjhvecjsvyyvbumjd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017743.2885807-433-172052252729521/AnsiballZ_copy.py'
Jan 21 17:49:04 compute-0 sudo[57209]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:49:04 compute-0 python3.9[57211]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1769017743.2885807-433-172052252729521/.source _original_basename=.vxp6zch6 follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:49:04 compute-0 sudo[57209]: pam_unix(sudo:session): session closed for user root
Jan 21 17:49:04 compute-0 sudo[57361]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vzqblcburnpomykirqoknvwbiahgxpke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017744.5912669-463-77051702218989/AnsiballZ_file.py'
Jan 21 17:49:04 compute-0 sudo[57361]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:49:05 compute-0 python3.9[57363]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:49:05 compute-0 sudo[57361]: pam_unix(sudo:session): session closed for user root
Jan 21 17:49:05 compute-0 sudo[57513]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vbbgsdgypzoftpwvirenmtxagxmvvdyp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017745.4453743-479-204445597306807/AnsiballZ_edpm_os_net_config_mappings.py'
Jan 21 17:49:05 compute-0 sudo[57513]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:49:06 compute-0 python3.9[57515]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Jan 21 17:49:06 compute-0 sudo[57513]: pam_unix(sudo:session): session closed for user root
Jan 21 17:49:06 compute-0 sudo[57665]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wrnbhvyyrlhokfrkuinnhnhvltcgektc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017746.2820332-497-89230686428833/AnsiballZ_file.py'
Jan 21 17:49:06 compute-0 sudo[57665]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:49:06 compute-0 python3.9[57667]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:49:06 compute-0 sudo[57665]: pam_unix(sudo:session): session closed for user root
Jan 21 17:49:07 compute-0 sudo[57817]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjnnrvxnppejyaytbwvivurvitfjsjck ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017747.0047085-517-218999384586521/AnsiballZ_stat.py'
Jan 21 17:49:07 compute-0 sudo[57817]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:49:07 compute-0 sudo[57817]: pam_unix(sudo:session): session closed for user root
Jan 21 17:49:07 compute-0 sudo[57940]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eurpxmdnuebwcldqgtieehndsevmerhs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017747.0047085-517-218999384586521/AnsiballZ_copy.py'
Jan 21 17:49:07 compute-0 sudo[57940]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:49:07 compute-0 sudo[57940]: pam_unix(sudo:session): session closed for user root
Jan 21 17:49:08 compute-0 sudo[58092]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-crhpjesiljxhqkayppssneoayohuhtyb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017748.3114839-547-220242097424277/AnsiballZ_slurp.py'
Jan 21 17:49:08 compute-0 sudo[58092]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:49:08 compute-0 python3.9[58094]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Jan 21 17:49:08 compute-0 sudo[58092]: pam_unix(sudo:session): session closed for user root
Jan 21 17:49:09 compute-0 sudo[58267]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tistweyizcmyjddqsxptkngcifyhajti ; ANSIBLE_ASYNC_DIR=\'~/.ansible_async\' /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017749.2414398-565-93028694203632/async_wrapper.py j431123851889 300 /home/zuul/.ansible/tmp/ansible-tmp-1769017749.2414398-565-93028694203632/AnsiballZ_edpm_os_net_config.py _'
Jan 21 17:49:09 compute-0 sudo[58267]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:49:10 compute-0 ansible-async_wrapper.py[58269]: Invoked with j431123851889 300 /home/zuul/.ansible/tmp/ansible-tmp-1769017749.2414398-565-93028694203632/AnsiballZ_edpm_os_net_config.py _
Jan 21 17:49:10 compute-0 ansible-async_wrapper.py[58272]: Starting module and watcher
Jan 21 17:49:10 compute-0 ansible-async_wrapper.py[58272]: Start watching 58273 (300)
Jan 21 17:49:10 compute-0 ansible-async_wrapper.py[58273]: Start module (58273)
Jan 21 17:49:10 compute-0 ansible-async_wrapper.py[58269]: Return async_wrapper task started.
Jan 21 17:49:10 compute-0 sudo[58267]: pam_unix(sudo:session): session closed for user root
Jan 21 17:49:10 compute-0 python3.9[58274]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Jan 21 17:49:10 compute-0 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Jan 21 17:49:10 compute-0 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Jan 21 17:49:10 compute-0 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Jan 21 17:49:10 compute-0 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Jan 21 17:49:10 compute-0 kernel: cfg80211: failed to load regulatory.db
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.8170] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58275 uid=0 result="success"
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.8194] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58275 uid=0 result="success"
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.8726] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.8728] audit: op="connection-add" uuid="5b092bc7-cfaa-4db9-a2d5-d734c951111a" name="br-ex-br" pid=58275 uid=0 result="success"
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.8750] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.8752] audit: op="connection-add" uuid="9d18c31c-f01a-4f82-821d-39a5ae2526fd" name="br-ex-port" pid=58275 uid=0 result="success"
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.8769] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.8770] audit: op="connection-add" uuid="7fea5462-0d84-42ac-a02c-07fbbb3782c1" name="eth1-port" pid=58275 uid=0 result="success"
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.8786] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.8788] audit: op="connection-add" uuid="aa42167b-55cf-4bf6-a57f-fa5cd8653a93" name="vlan20-port" pid=58275 uid=0 result="success"
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.8803] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.8804] audit: op="connection-add" uuid="d9eaf286-c799-4ac8-9430-d5f8eb2d3b70" name="vlan21-port" pid=58275 uid=0 result="success"
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.8817] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.8819] audit: op="connection-add" uuid="00ac9af1-a72a-4db1-aac7-26fffa0cb22f" name="vlan22-port" pid=58275 uid=0 result="success"
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.8841] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="ipv6.method,ipv6.addr-gen-mode,ipv6.dhcp-timeout,802-3-ethernet.mtu,connection.autoconnect-priority,connection.timestamp,ipv4.dhcp-client-id,ipv4.dhcp-timeout" pid=58275 uid=0 result="success"
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.8861] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/10)
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.8862] audit: op="connection-add" uuid="84c2380a-5ae2-418e-9274-0edbe6ef0951" name="br-ex-if" pid=58275 uid=0 result="success"
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.8925] audit: op="connection-update" uuid="811a0e92-136b-562e-ae39-d602e48987d5" name="ci-private-network" args="ipv6.method,ipv6.dns,ipv6.routes,ipv6.addresses,ipv6.addr-gen-mode,ipv6.routing-rules,ovs-external-ids.data,connection.port-type,connection.master,connection.slave-type,connection.controller,connection.timestamp,ipv4.dns,ipv4.routes,ipv4.routing-rules,ipv4.addresses,ipv4.method,ipv4.never-default,ovs-interface.type" pid=58275 uid=0 result="success"
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.8942] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.8944] audit: op="connection-add" uuid="4a6a8d16-358e-43dc-b063-095fe2384c00" name="vlan20-if" pid=58275 uid=0 result="success"
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.8958] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.8959] audit: op="connection-add" uuid="828ff2a9-64ce-40ae-bc64-63e572efdbef" name="vlan21-if" pid=58275 uid=0 result="success"
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.8973] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.8974] audit: op="connection-add" uuid="fdc2a198-9f9f-4ee6-9367-bda68b74659b" name="vlan22-if" pid=58275 uid=0 result="success"
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.8987] audit: op="connection-delete" uuid="3cb94cdd-a1f2-3cfa-b2b4-788809cba9e0" name="Wired connection 1" pid=58275 uid=0 result="success"
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.8999] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <warn>  [1769017751.9002] device (br-ex)[Open vSwitch Bridge]: error setting IPv4 forwarding to '1': Success
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9010] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9013] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (5b092bc7-cfaa-4db9-a2d5-d734c951111a)
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9014] audit: op="connection-activate" uuid="5b092bc7-cfaa-4db9-a2d5-d734c951111a" name="br-ex-br" pid=58275 uid=0 result="success"
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9016] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <warn>  [1769017751.9016] device (br-ex)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Success
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9021] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9026] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (9d18c31c-f01a-4f82-821d-39a5ae2526fd)
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9027] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <warn>  [1769017751.9028] device (eth1)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9032] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9035] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (7fea5462-0d84-42ac-a02c-07fbbb3782c1)
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9037] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <warn>  [1769017751.9037] device (vlan20)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9041] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9044] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (aa42167b-55cf-4bf6-a57f-fa5cd8653a93)
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9045] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <warn>  [1769017751.9046] device (vlan21)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9049] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9053] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (d9eaf286-c799-4ac8-9430-d5f8eb2d3b70)
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9054] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <warn>  [1769017751.9055] device (vlan22)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9059] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9063] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (00ac9af1-a72a-4db1-aac7-26fffa0cb22f)
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9063] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9065] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9067] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9072] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <warn>  [1769017751.9073] device (br-ex)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9075] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9079] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (84c2380a-5ae2-418e-9274-0edbe6ef0951)
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9079] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9081] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9083] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9083] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9084] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9092] device (eth1): disconnecting for new activation request.
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9092] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9149] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9151] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9152] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9155] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <warn>  [1769017751.9156] device (vlan20)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9160] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9165] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (4a6a8d16-358e-43dc-b063-095fe2384c00)
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9166] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9170] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9172] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9173] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9176] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <warn>  [1769017751.9179] device (vlan21)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9182] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9187] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (828ff2a9-64ce-40ae-bc64-63e572efdbef)
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9188] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9192] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9194] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9195] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9198] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <warn>  [1769017751.9199] device (vlan22)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9204] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9209] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (fdc2a198-9f9f-4ee6-9367-bda68b74659b)
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9209] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9211] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9213] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9214] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9215] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9227] audit: op="device-reapply" interface="eth0" ifindex=2 args="ipv6.method,ipv6.addr-gen-mode,802-3-ethernet.mtu,connection.autoconnect-priority,ipv4.dhcp-client-id,ipv4.dhcp-timeout" pid=58275 uid=0 result="success"
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9229] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9232] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9233] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9240] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9243] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9247] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9250] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9252] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 21 17:49:11 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9255] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9259] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9262] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 21 17:49:11 compute-0 kernel: ovs-system: entered promiscuous mode
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9274] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9279] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9282] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9286] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9287] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9291] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 21 17:49:11 compute-0 kernel: Timeout policy base is empty
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9295] dhcp4 (eth0): canceled DHCP transaction
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9295] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9296] dhcp4 (eth0): state changed no lease
Jan 21 17:49:11 compute-0 systemd-udevd[58280]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9297] dhcp4 (eth0): activation: beginning transaction (no timeout)
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9308] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9310] audit: op="device-reapply" interface="eth1" ifindex=3 pid=58275 uid=0 result="fail" reason="Device is not activated"
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9313] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Jan 21 17:49:11 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9346] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9352] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9359] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9483] device (eth1): Activation: starting connection 'ci-private-network' (811a0e92-136b-562e-ae39-d602e48987d5)
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9487] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9489] device (eth1): state change: disconnected -> deactivating (reason 'new-activation', managed-type: 'full')
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9496] device (eth1): disconnecting for new activation request.
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9496] audit: op="connection-activate" uuid="811a0e92-136b-562e-ae39-d602e48987d5" name="ci-private-network" pid=58275 uid=0 result="success"
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9501] dhcp4 (eth0): state changed new lease, address=38.102.83.50
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9509] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9514] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9558] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9559] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9560] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9562] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9563] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9572] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9576] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9581] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9586] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9589] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9593] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9597] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9602] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9605] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9609] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9613] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9620] device (eth1): Activation: starting connection 'ci-private-network' (811a0e92-136b-562e-ae39-d602e48987d5)
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9623] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58275 uid=0 result="success"
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9626] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9629] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9638] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 21 17:49:11 compute-0 kernel: br-ex: entered promiscuous mode
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9709] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Jan 21 17:49:11 compute-0 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9717] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9737] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9739] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9743] device (eth1): Activation: successful, device activated.
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9792] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9803] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9839] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 21 17:49:11 compute-0 kernel: vlan22: entered promiscuous mode
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9843] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 21 17:49:11 compute-0 NetworkManager[55506]: <info>  [1769017751.9849] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 21 17:49:11 compute-0 kernel: vlan20: entered promiscuous mode
Jan 21 17:49:11 compute-0 systemd-udevd[58279]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 17:49:11 compute-0 kernel: vlan21: entered promiscuous mode
Jan 21 17:49:12 compute-0 NetworkManager[55506]: <info>  [1769017752.0370] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Jan 21 17:49:12 compute-0 NetworkManager[55506]: <info>  [1769017752.0377] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Jan 21 17:49:12 compute-0 NetworkManager[55506]: <info>  [1769017752.0384] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Jan 21 17:49:12 compute-0 NetworkManager[55506]: <info>  [1769017752.0414] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 21 17:49:12 compute-0 NetworkManager[55506]: <info>  [1769017752.0421] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 21 17:49:12 compute-0 NetworkManager[55506]: <info>  [1769017752.0428] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 21 17:49:12 compute-0 NetworkManager[55506]: <info>  [1769017752.0448] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 21 17:49:12 compute-0 NetworkManager[55506]: <info>  [1769017752.0449] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 21 17:49:12 compute-0 NetworkManager[55506]: <info>  [1769017752.0454] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 21 17:49:12 compute-0 NetworkManager[55506]: <info>  [1769017752.0463] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 21 17:49:12 compute-0 NetworkManager[55506]: <info>  [1769017752.0464] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 21 17:49:12 compute-0 NetworkManager[55506]: <info>  [1769017752.0465] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 21 17:49:12 compute-0 NetworkManager[55506]: <info>  [1769017752.0469] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 21 17:49:12 compute-0 NetworkManager[55506]: <info>  [1769017752.0474] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 21 17:49:12 compute-0 NetworkManager[55506]: <info>  [1769017752.0478] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 21 17:49:13 compute-0 NetworkManager[55506]: <info>  [1769017753.1712] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58275 uid=0 result="success"
Jan 21 17:49:13 compute-0 NetworkManager[55506]: <info>  [1769017753.2886] checkpoint[0x56523051d950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Jan 21 17:49:13 compute-0 NetworkManager[55506]: <info>  [1769017753.2888] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58275 uid=0 result="success"
Jan 21 17:49:13 compute-0 NetworkManager[55506]: <info>  [1769017753.5213] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58275 uid=0 result="success"
Jan 21 17:49:13 compute-0 NetworkManager[55506]: <info>  [1769017753.5223] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58275 uid=0 result="success"
Jan 21 17:49:13 compute-0 sudo[58612]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-egexqoeueshxzyawpdwgqcksbcbnhvzy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017753.1776075-565-61284015777660/AnsiballZ_async_status.py'
Jan 21 17:49:13 compute-0 sudo[58612]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:49:13 compute-0 NetworkManager[55506]: <info>  [1769017753.7178] audit: op="networking-control" arg="global-dns-configuration" pid=58275 uid=0 result="success"
Jan 21 17:49:13 compute-0 NetworkManager[55506]: <info>  [1769017753.7208] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Jan 21 17:49:13 compute-0 NetworkManager[55506]: <info>  [1769017753.7240] audit: op="networking-control" arg="global-dns-configuration" pid=58275 uid=0 result="success"
Jan 21 17:49:13 compute-0 NetworkManager[55506]: <info>  [1769017753.7770] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58275 uid=0 result="success"
Jan 21 17:49:13 compute-0 python3.9[58614]: ansible-ansible.legacy.async_status Invoked with jid=j431123851889.58269 mode=status _async_dir=/root/.ansible_async
Jan 21 17:49:13 compute-0 sudo[58612]: pam_unix(sudo:session): session closed for user root
Jan 21 17:49:13 compute-0 NetworkManager[55506]: <info>  [1769017753.9044] checkpoint[0x56523051da20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Jan 21 17:49:13 compute-0 NetworkManager[55506]: <info>  [1769017753.9048] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58275 uid=0 result="success"
Jan 21 17:49:13 compute-0 ansible-async_wrapper.py[58273]: Module complete (58273)
Jan 21 17:49:15 compute-0 ansible-async_wrapper.py[58272]: Done in kid B.
Jan 21 17:49:17 compute-0 sudo[58716]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oqhwdqexukfsqqvvmeaujffkwzlrnvam ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017753.1776075-565-61284015777660/AnsiballZ_async_status.py'
Jan 21 17:49:17 compute-0 sudo[58716]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:49:17 compute-0 python3.9[58718]: ansible-ansible.legacy.async_status Invoked with jid=j431123851889.58269 mode=status _async_dir=/root/.ansible_async
Jan 21 17:49:17 compute-0 sudo[58716]: pam_unix(sudo:session): session closed for user root
Jan 21 17:49:17 compute-0 sudo[58816]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fweirunjbskunwchhxniwjwkbdqsnkim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017753.1776075-565-61284015777660/AnsiballZ_async_status.py'
Jan 21 17:49:17 compute-0 sudo[58816]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:49:17 compute-0 python3.9[58818]: ansible-ansible.legacy.async_status Invoked with jid=j431123851889.58269 mode=cleanup _async_dir=/root/.ansible_async
Jan 21 17:49:17 compute-0 sudo[58816]: pam_unix(sudo:session): session closed for user root
Jan 21 17:49:18 compute-0 sudo[58968]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nvyofyhwpqvfmwyulvjixolprypexedd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017758.0572197-619-243937666221276/AnsiballZ_stat.py'
Jan 21 17:49:18 compute-0 sudo[58968]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:49:18 compute-0 python3.9[58970]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:49:18 compute-0 sudo[58968]: pam_unix(sudo:session): session closed for user root
Jan 21 17:49:18 compute-0 sudo[59091]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-idwqjskonddtanaqsclzalgnhvohfrdc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017758.0572197-619-243937666221276/AnsiballZ_copy.py'
Jan 21 17:49:18 compute-0 sudo[59091]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:49:18 compute-0 python3.9[59093]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769017758.0572197-619-243937666221276/.source.returncode _original_basename=.bs7_zje6 follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:49:18 compute-0 sudo[59091]: pam_unix(sudo:session): session closed for user root
Jan 21 17:49:19 compute-0 sudo[59243]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lxufjymadtklsxmujiizqsfyyptrybyk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017759.2945182-651-146204342706427/AnsiballZ_stat.py'
Jan 21 17:49:19 compute-0 sudo[59243]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:49:19 compute-0 python3.9[59245]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:49:19 compute-0 sudo[59243]: pam_unix(sudo:session): session closed for user root
Jan 21 17:49:20 compute-0 sudo[59368]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qdrgqxpklgyuepnlmvwxbtpleppsxpid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017759.2945182-651-146204342706427/AnsiballZ_copy.py'
Jan 21 17:49:20 compute-0 sudo[59368]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:49:20 compute-0 python3.9[59370]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769017759.2945182-651-146204342706427/.source.cfg _original_basename=.dvopfamz follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:49:20 compute-0 sudo[59368]: pam_unix(sudo:session): session closed for user root
Jan 21 17:49:20 compute-0 sshd-session[59339]: Connection closed by authenticating user root 64.227.98.100 port 39204 [preauth]
Jan 21 17:49:20 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 21 17:49:20 compute-0 sudo[59524]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sehhuptiojreotqnernyvymtmrbjfvoj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017760.6700099-681-241414836230872/AnsiballZ_systemd.py'
Jan 21 17:49:20 compute-0 sudo[59524]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:49:21 compute-0 python3.9[59526]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 21 17:49:21 compute-0 systemd[1]: Reloading Network Manager...
Jan 21 17:49:21 compute-0 NetworkManager[55506]: <info>  [1769017761.3158] audit: op="reload" arg="0" pid=59530 uid=0 result="success"
Jan 21 17:49:21 compute-0 NetworkManager[55506]: <info>  [1769017761.3165] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Jan 21 17:49:21 compute-0 systemd[1]: Reloaded Network Manager.
Jan 21 17:49:21 compute-0 sudo[59524]: pam_unix(sudo:session): session closed for user root
Jan 21 17:49:21 compute-0 sshd-session[51505]: Connection closed by 192.168.122.30 port 38612
Jan 21 17:49:21 compute-0 sshd-session[51502]: pam_unix(sshd:session): session closed for user zuul
Jan 21 17:49:21 compute-0 systemd[1]: session-12.scope: Deactivated successfully.
Jan 21 17:49:21 compute-0 systemd[1]: session-12.scope: Consumed 48.975s CPU time.
Jan 21 17:49:21 compute-0 systemd-logind[782]: Session 12 logged out. Waiting for processes to exit.
Jan 21 17:49:21 compute-0 systemd-logind[782]: Removed session 12.
Jan 21 17:49:27 compute-0 sshd-session[59561]: Accepted publickey for zuul from 192.168.122.30 port 44128 ssh2: ECDSA SHA256:cKen23vhgWniT1Xii+Y+iYDmAKCwydOnjbSSWnKmVRE
Jan 21 17:49:27 compute-0 systemd-logind[782]: New session 13 of user zuul.
Jan 21 17:49:27 compute-0 systemd[1]: Started Session 13 of User zuul.
Jan 21 17:49:27 compute-0 sshd-session[59561]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 21 17:49:28 compute-0 python3.9[59714]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 17:49:29 compute-0 python3.9[59868]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 21 17:49:30 compute-0 python3.9[60058]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 17:49:31 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 21 17:49:31 compute-0 sshd-session[59564]: Connection closed by 192.168.122.30 port 44128
Jan 21 17:49:31 compute-0 sshd-session[59561]: pam_unix(sshd:session): session closed for user zuul
Jan 21 17:49:31 compute-0 systemd[1]: session-13.scope: Deactivated successfully.
Jan 21 17:49:31 compute-0 systemd[1]: session-13.scope: Consumed 2.270s CPU time.
Jan 21 17:49:31 compute-0 systemd-logind[782]: Session 13 logged out. Waiting for processes to exit.
Jan 21 17:49:31 compute-0 systemd-logind[782]: Removed session 13.
Jan 21 17:49:36 compute-0 sshd-session[60087]: Received disconnect from 125.124.24.140 port 51684:11:  [preauth]
Jan 21 17:49:36 compute-0 sshd-session[60087]: Disconnected from authenticating user root 125.124.24.140 port 51684 [preauth]
Jan 21 17:49:37 compute-0 sshd-session[60090]: Accepted publickey for zuul from 192.168.122.30 port 33828 ssh2: ECDSA SHA256:cKen23vhgWniT1Xii+Y+iYDmAKCwydOnjbSSWnKmVRE
Jan 21 17:49:37 compute-0 systemd-logind[782]: New session 14 of user zuul.
Jan 21 17:49:37 compute-0 systemd[1]: Started Session 14 of User zuul.
Jan 21 17:49:37 compute-0 sshd-session[60090]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 21 17:49:38 compute-0 python3.9[60243]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 17:49:39 compute-0 python3.9[60397]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 17:49:39 compute-0 sudo[60552]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pedatpbpqonffqcayzxkxvuwwuxlvhjp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017779.6325176-55-54783114391321/AnsiballZ_setup.py'
Jan 21 17:49:39 compute-0 sudo[60552]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:49:40 compute-0 python3.9[60554]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 21 17:49:40 compute-0 sudo[60552]: pam_unix(sudo:session): session closed for user root
Jan 21 17:49:41 compute-0 sudo[60636]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-epegzhsjgebxftgttedtivcgjcnxnxtx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017779.6325176-55-54783114391321/AnsiballZ_dnf.py'
Jan 21 17:49:41 compute-0 sudo[60636]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:49:41 compute-0 python3.9[60638]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 21 17:49:42 compute-0 sudo[60636]: pam_unix(sudo:session): session closed for user root
Jan 21 17:49:43 compute-0 sudo[60790]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-axvjnqekoosynihjfjinqmenkpujnrjp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017782.8836684-79-211729906418727/AnsiballZ_setup.py'
Jan 21 17:49:43 compute-0 sudo[60790]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:49:43 compute-0 python3.9[60792]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 21 17:49:43 compute-0 sudo[60790]: pam_unix(sudo:session): session closed for user root
Jan 21 17:49:44 compute-0 sudo[60981]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-drnqgabiebrjbdxnauiwzkfayyukqkjz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017784.1644263-101-34988095907448/AnsiballZ_file.py'
Jan 21 17:49:44 compute-0 sudo[60981]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:49:44 compute-0 python3.9[60983]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:49:44 compute-0 sudo[60981]: pam_unix(sudo:session): session closed for user root
Jan 21 17:49:45 compute-0 sudo[61133]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hlzqmoxmzozdjynzsezlehgfikichvfe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017785.0152588-117-255854934924519/AnsiballZ_command.py'
Jan 21 17:49:45 compute-0 sudo[61133]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:49:45 compute-0 python3.9[61135]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 17:49:45 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 17:49:45 compute-0 sudo[61133]: pam_unix(sudo:session): session closed for user root
Jan 21 17:49:46 compute-0 sudo[61296]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xlxjifyqrjthhbccgomohnogyucqhbxv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017785.944235-133-51091819271998/AnsiballZ_stat.py'
Jan 21 17:49:46 compute-0 sudo[61296]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:49:46 compute-0 python3.9[61298]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:49:46 compute-0 sudo[61296]: pam_unix(sudo:session): session closed for user root
Jan 21 17:49:46 compute-0 sudo[61374]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wdygomiorbcehxdqixgegfkumreuimqr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017785.944235-133-51091819271998/AnsiballZ_file.py'
Jan 21 17:49:46 compute-0 sudo[61374]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:49:47 compute-0 python3.9[61376]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:49:47 compute-0 sudo[61374]: pam_unix(sudo:session): session closed for user root
Jan 21 17:49:47 compute-0 sudo[61526]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xgzjdntmiqdelwvblhtoeshleyntvdwt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017787.4162493-157-167786656245454/AnsiballZ_stat.py'
Jan 21 17:49:47 compute-0 sudo[61526]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:49:47 compute-0 python3.9[61528]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:49:47 compute-0 sudo[61526]: pam_unix(sudo:session): session closed for user root
Jan 21 17:49:48 compute-0 sudo[61604]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oicteujzamqhhwdvhqwoobtbyaoiijiu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017787.4162493-157-167786656245454/AnsiballZ_file.py'
Jan 21 17:49:48 compute-0 sudo[61604]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:49:48 compute-0 python3.9[61606]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 17:49:48 compute-0 sudo[61604]: pam_unix(sudo:session): session closed for user root
Jan 21 17:49:49 compute-0 sudo[61756]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gugrwvcfoerqnvaqujluiejqhtunhpnq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017788.7663205-183-255947449397891/AnsiballZ_ini_file.py'
Jan 21 17:49:49 compute-0 sudo[61756]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:49:49 compute-0 python3.9[61758]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 21 17:49:49 compute-0 sudo[61756]: pam_unix(sudo:session): session closed for user root
Jan 21 17:49:49 compute-0 sudo[61908]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qudhnswvdyfnznprkjlaczdpadvkvaqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017789.567364-183-156449754907932/AnsiballZ_ini_file.py'
Jan 21 17:49:49 compute-0 sudo[61908]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:49:50 compute-0 python3.9[61910]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 21 17:49:50 compute-0 sudo[61908]: pam_unix(sudo:session): session closed for user root
Jan 21 17:49:50 compute-0 sudo[62060]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dmveperodxffovhnswwfxsfqcbjznjoc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017790.1515017-183-216289608135760/AnsiballZ_ini_file.py'
Jan 21 17:49:50 compute-0 sudo[62060]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:49:50 compute-0 python3.9[62062]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 21 17:49:50 compute-0 sudo[62060]: pam_unix(sudo:session): session closed for user root
Jan 21 17:49:51 compute-0 sudo[62212]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aqlwoaldgvgdanlgbasnfxoyrodyypqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017790.7431374-183-217059720883777/AnsiballZ_ini_file.py'
Jan 21 17:49:51 compute-0 sudo[62212]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:49:51 compute-0 python3.9[62214]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 21 17:49:51 compute-0 sudo[62212]: pam_unix(sudo:session): session closed for user root
Jan 21 17:49:51 compute-0 sudo[62364]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zqsadzmaxncodzdamuzecndvgvlqexgh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017791.679156-245-232303004370099/AnsiballZ_dnf.py'
Jan 21 17:49:51 compute-0 sudo[62364]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:49:52 compute-0 python3.9[62366]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 21 17:49:53 compute-0 sudo[62364]: pam_unix(sudo:session): session closed for user root
Jan 21 17:49:54 compute-0 sudo[62517]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kdxrnbqrudoqemnunbitrfbrezwqdabg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017794.1369839-267-138954206378404/AnsiballZ_setup.py'
Jan 21 17:49:54 compute-0 sudo[62517]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:49:54 compute-0 python3.9[62519]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 17:49:54 compute-0 sudo[62517]: pam_unix(sudo:session): session closed for user root
Jan 21 17:49:55 compute-0 sudo[62671]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qzndxguoxaimkoyfggmbzdmonkjbbuda ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017794.989731-283-167675991135281/AnsiballZ_stat.py'
Jan 21 17:49:55 compute-0 sudo[62671]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:49:55 compute-0 python3.9[62673]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 17:49:55 compute-0 sudo[62671]: pam_unix(sudo:session): session closed for user root
Jan 21 17:49:55 compute-0 sudo[62823]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yghnfmvldacrvxwqbyqcbnxpnyrqoxwj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017795.7094848-301-38534013671200/AnsiballZ_stat.py'
Jan 21 17:49:55 compute-0 sudo[62823]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:49:56 compute-0 python3.9[62825]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 17:49:56 compute-0 sudo[62823]: pam_unix(sudo:session): session closed for user root
Jan 21 17:49:56 compute-0 sudo[62975]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oiwhkeopjibgtwibocrcmaxqnypnhkny ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017796.4501946-321-88158603172983/AnsiballZ_command.py'
Jan 21 17:49:56 compute-0 sudo[62975]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:49:56 compute-0 python3.9[62977]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 17:49:56 compute-0 sudo[62975]: pam_unix(sudo:session): session closed for user root
Jan 21 17:49:57 compute-0 sudo[63128]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cokcqaewtbrnerngwsxpmemapkqaohnk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017797.2645006-341-33920443285546/AnsiballZ_service_facts.py'
Jan 21 17:49:57 compute-0 sudo[63128]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:49:57 compute-0 python3.9[63130]: ansible-service_facts Invoked
Jan 21 17:49:57 compute-0 network[63147]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 21 17:49:57 compute-0 network[63148]: 'network-scripts' will be removed from distribution in near future.
Jan 21 17:49:57 compute-0 network[63149]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 21 17:50:00 compute-0 sudo[63128]: pam_unix(sudo:session): session closed for user root
Jan 21 17:50:01 compute-0 sudo[63432]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qllyhoepzkmztkdkuuowducxsvjwdcwr ; /bin/bash /home/zuul/.ansible/tmp/ansible-tmp-1769017801.5032172-371-149789746054988/AnsiballZ_timesync_provider.sh /home/zuul/.ansible/tmp/ansible-tmp-1769017801.5032172-371-149789746054988/args'
Jan 21 17:50:01 compute-0 sudo[63432]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:50:01 compute-0 sudo[63432]: pam_unix(sudo:session): session closed for user root
Jan 21 17:50:02 compute-0 sudo[63599]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oaiiyzitamtbovhdrlvksycsituzgzat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017802.2720397-393-44720196790281/AnsiballZ_dnf.py'
Jan 21 17:50:02 compute-0 sudo[63599]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:50:02 compute-0 python3.9[63601]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 21 17:50:04 compute-0 sudo[63599]: pam_unix(sudo:session): session closed for user root
Jan 21 17:50:05 compute-0 sudo[63753]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gvjmcwbncmrdjdcqrdqheayemhtlxetk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017804.6185343-419-237759864711275/AnsiballZ_package_facts.py'
Jan 21 17:50:05 compute-0 sudo[63753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:50:05 compute-0 python3.9[63755]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Jan 21 17:50:05 compute-0 sudo[63753]: pam_unix(sudo:session): session closed for user root
Jan 21 17:50:06 compute-0 sudo[63905]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tfvkqqcmbynsjkreveyiceanldntixsn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017806.3731732-439-68259921345317/AnsiballZ_stat.py'
Jan 21 17:50:06 compute-0 sudo[63905]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:50:06 compute-0 python3.9[63907]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:50:06 compute-0 sudo[63905]: pam_unix(sudo:session): session closed for user root
Jan 21 17:50:07 compute-0 sudo[64030]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zdiaigzwfitbzzczwfoboqcwvoozhlfa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017806.3731732-439-68259921345317/AnsiballZ_copy.py'
Jan 21 17:50:07 compute-0 sudo[64030]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:50:07 compute-0 python3.9[64032]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769017806.3731732-439-68259921345317/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:50:07 compute-0 sudo[64030]: pam_unix(sudo:session): session closed for user root
Jan 21 17:50:08 compute-0 sudo[64184]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pugnnlkpyeykcgdljhvycetoozpsdwbs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017807.974344-469-220691073544954/AnsiballZ_stat.py'
Jan 21 17:50:08 compute-0 sudo[64184]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:50:08 compute-0 python3.9[64186]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:50:08 compute-0 sudo[64184]: pam_unix(sudo:session): session closed for user root
Jan 21 17:50:08 compute-0 sudo[64309]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hppcjmhaetcchscurzhjbiqdxhnpcmtc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017807.974344-469-220691073544954/AnsiballZ_copy.py'
Jan 21 17:50:08 compute-0 sudo[64309]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:50:09 compute-0 python3.9[64311]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769017807.974344-469-220691073544954/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:50:09 compute-0 sudo[64309]: pam_unix(sudo:session): session closed for user root
Jan 21 17:50:10 compute-0 sudo[64463]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tapilrxodbufxmsvcdlndipyjkkexvor ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017809.8438935-511-272517982934858/AnsiballZ_lineinfile.py'
Jan 21 17:50:10 compute-0 sudo[64463]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:50:10 compute-0 python3.9[64465]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:50:10 compute-0 sudo[64463]: pam_unix(sudo:session): session closed for user root
Jan 21 17:50:11 compute-0 sudo[64617]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-owxyifcghkiorfwncarzvwelhkymuqcl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017811.3200252-541-71646628256027/AnsiballZ_setup.py'
Jan 21 17:50:11 compute-0 sudo[64617]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:50:11 compute-0 python3.9[64619]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 21 17:50:12 compute-0 sudo[64617]: pam_unix(sudo:session): session closed for user root
Jan 21 17:50:12 compute-0 sudo[64701]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-udnjssyahjkbzdfqwhtsltcrruineslw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017811.3200252-541-71646628256027/AnsiballZ_systemd.py'
Jan 21 17:50:12 compute-0 sudo[64701]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:50:12 compute-0 python3.9[64703]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 17:50:13 compute-0 sudo[64701]: pam_unix(sudo:session): session closed for user root
Jan 21 17:50:14 compute-0 sudo[64855]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mvumowaciopzzzbaknzyjxzvksqryswv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017813.7369118-573-18868726347134/AnsiballZ_setup.py'
Jan 21 17:50:14 compute-0 sudo[64855]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:50:14 compute-0 python3.9[64857]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 21 17:50:14 compute-0 sudo[64855]: pam_unix(sudo:session): session closed for user root
Jan 21 17:50:14 compute-0 sudo[64939]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aeuhlblvuuutfjiophntgvyxyyqahlwq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017813.7369118-573-18868726347134/AnsiballZ_systemd.py'
Jan 21 17:50:14 compute-0 sudo[64939]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:50:15 compute-0 python3.9[64941]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 21 17:50:15 compute-0 chronyd[794]: chronyd exiting
Jan 21 17:50:15 compute-0 systemd[1]: Stopping NTP client/server...
Jan 21 17:50:15 compute-0 systemd[1]: chronyd.service: Deactivated successfully.
Jan 21 17:50:15 compute-0 systemd[1]: Stopped NTP client/server.
Jan 21 17:50:15 compute-0 systemd[1]: Starting NTP client/server...
Jan 21 17:50:15 compute-0 chronyd[64950]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Jan 21 17:50:15 compute-0 chronyd[64950]: Frequency -23.394 +/- 0.088 ppm read from /var/lib/chrony/drift
Jan 21 17:50:15 compute-0 chronyd[64950]: Loaded seccomp filter (level 2)
Jan 21 17:50:15 compute-0 systemd[1]: Started NTP client/server.
Jan 21 17:50:15 compute-0 sudo[64939]: pam_unix(sudo:session): session closed for user root
Jan 21 17:50:16 compute-0 sshd-session[60093]: Connection closed by 192.168.122.30 port 33828
Jan 21 17:50:16 compute-0 sshd-session[60090]: pam_unix(sshd:session): session closed for user zuul
Jan 21 17:50:16 compute-0 systemd[1]: session-14.scope: Deactivated successfully.
Jan 21 17:50:16 compute-0 systemd[1]: session-14.scope: Consumed 25.208s CPU time.
Jan 21 17:50:16 compute-0 systemd-logind[782]: Session 14 logged out. Waiting for processes to exit.
Jan 21 17:50:16 compute-0 systemd-logind[782]: Removed session 14.
Jan 21 17:50:21 compute-0 sshd-session[64976]: Received disconnect from 39.191.29.114 port 53220:11:  [preauth]
Jan 21 17:50:21 compute-0 sshd-session[64976]: Disconnected from authenticating user root 39.191.29.114 port 53220 [preauth]
Jan 21 17:50:22 compute-0 sshd-session[64978]: Accepted publickey for zuul from 192.168.122.30 port 49638 ssh2: ECDSA SHA256:cKen23vhgWniT1Xii+Y+iYDmAKCwydOnjbSSWnKmVRE
Jan 21 17:50:22 compute-0 systemd-logind[782]: New session 15 of user zuul.
Jan 21 17:50:22 compute-0 systemd[1]: Started Session 15 of User zuul.
Jan 21 17:50:22 compute-0 sshd-session[64978]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 21 17:50:23 compute-0 python3.9[65131]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 17:50:24 compute-0 sudo[65285]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iifcacfhiytnxebkfcztzbnlwqfxncyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017824.134036-41-205376413673349/AnsiballZ_file.py'
Jan 21 17:50:24 compute-0 sudo[65285]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:50:24 compute-0 python3.9[65287]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:50:24 compute-0 sudo[65285]: pam_unix(sudo:session): session closed for user root
Jan 21 17:50:25 compute-0 sudo[65460]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ajlzvjtafdkyppyexvgttlynyhdbmjvl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017824.9900177-57-39212637793796/AnsiballZ_stat.py'
Jan 21 17:50:25 compute-0 sudo[65460]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:50:25 compute-0 python3.9[65462]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:50:25 compute-0 sudo[65460]: pam_unix(sudo:session): session closed for user root
Jan 21 17:50:25 compute-0 sudo[65538]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vkqgzkusbiwbntktgilvdcmdpfwrsfuf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017824.9900177-57-39212637793796/AnsiballZ_file.py'
Jan 21 17:50:25 compute-0 sudo[65538]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:50:26 compute-0 python3.9[65540]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.gwhq844s recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:50:26 compute-0 sudo[65538]: pam_unix(sudo:session): session closed for user root
Jan 21 17:50:26 compute-0 sudo[65690]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sndkzfanswgmuvpegzomirxofspffkhr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017826.4352417-97-131723290340698/AnsiballZ_stat.py'
Jan 21 17:50:26 compute-0 sudo[65690]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:50:26 compute-0 python3.9[65692]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:50:26 compute-0 sudo[65690]: pam_unix(sudo:session): session closed for user root
Jan 21 17:50:27 compute-0 sudo[65813]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hpasqrrxbbocopcfxwpwscdhrjujdsof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017826.4352417-97-131723290340698/AnsiballZ_copy.py'
Jan 21 17:50:27 compute-0 sudo[65813]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:50:27 compute-0 python3.9[65815]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769017826.4352417-97-131723290340698/.source _original_basename=.4jrbws86 follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:50:27 compute-0 sudo[65813]: pam_unix(sudo:session): session closed for user root
Jan 21 17:50:28 compute-0 sudo[65965]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jxmxsijfbglrubxlvhqokpnqwgpxomfz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017827.858223-129-212635390065141/AnsiballZ_file.py'
Jan 21 17:50:28 compute-0 sudo[65965]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:50:28 compute-0 python3.9[65967]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 17:50:28 compute-0 sudo[65965]: pam_unix(sudo:session): session closed for user root
Jan 21 17:50:28 compute-0 sudo[66117]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zgtupvstcpnvtbcxvcfhnfovgllbkies ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017828.5752637-145-118059866077563/AnsiballZ_stat.py'
Jan 21 17:50:28 compute-0 sudo[66117]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:50:29 compute-0 python3.9[66119]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:50:29 compute-0 sudo[66117]: pam_unix(sudo:session): session closed for user root
Jan 21 17:50:29 compute-0 sudo[66240]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pddgjiajagtfolpkatvvqimlpajaewxj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017828.5752637-145-118059866077563/AnsiballZ_copy.py'
Jan 21 17:50:29 compute-0 sudo[66240]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:51:06 compute-0 python3.9[66242]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769017828.5752637-145-118059866077563/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 21 17:51:06 compute-0 sudo[66240]: pam_unix(sudo:session): session closed for user root
Jan 21 17:51:06 compute-0 sudo[66396]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dtglbzaebzhmuxzmbuledingbuafvbqt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017866.4262314-145-98581741329666/AnsiballZ_stat.py'
Jan 21 17:51:06 compute-0 sudo[66396]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:51:31 compute-0 sshd-session[66400]: Invalid user user from 64.227.98.100 port 58786
Jan 21 17:51:31 compute-0 sshd-session[66400]: Connection closed by invalid user user 64.227.98.100 port 58786 [preauth]
Jan 21 17:52:02 compute-0 python3.9[66398]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:52:02 compute-0 sudo[66396]: pam_unix(sudo:session): session closed for user root
Jan 21 17:52:02 compute-0 sudo[66522]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-izetkpnqumppzapxanzefytzthrbggmw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017866.4262314-145-98581741329666/AnsiballZ_copy.py'
Jan 21 17:52:02 compute-0 sudo[66522]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:52:02 compute-0 python3.9[66524]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769017866.4262314-145-98581741329666/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 21 17:52:02 compute-0 sudo[66522]: pam_unix(sudo:session): session closed for user root
Jan 21 17:52:03 compute-0 sudo[66674]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mbyxctjijwpmbnwmoeujkvaerfwpjyez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017922.8436615-203-71934925374846/AnsiballZ_file.py'
Jan 21 17:52:03 compute-0 sudo[66674]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:52:03 compute-0 python3.9[66676]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:52:03 compute-0 sudo[66674]: pam_unix(sudo:session): session closed for user root
Jan 21 17:52:03 compute-0 sudo[66826]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-akfewihogozdllwkumcjxomsbnnggprw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017923.6364307-222-202198911943282/AnsiballZ_stat.py'
Jan 21 17:52:03 compute-0 sudo[66826]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:52:04 compute-0 python3.9[66828]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:52:04 compute-0 sudo[66826]: pam_unix(sudo:session): session closed for user root
Jan 21 17:52:04 compute-0 sudo[66949]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxhniijyolkeyhwojlufndxcilfnyfcc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017923.6364307-222-202198911943282/AnsiballZ_copy.py'
Jan 21 17:52:04 compute-0 sudo[66949]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:52:04 compute-0 python3.9[66951]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769017923.6364307-222-202198911943282/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:52:04 compute-0 sudo[66949]: pam_unix(sudo:session): session closed for user root
Jan 21 17:52:05 compute-0 sudo[67101]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hrdxeoxtlquoqsmxilslxzvqezmvqqob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017924.8631754-252-251376526675497/AnsiballZ_stat.py'
Jan 21 17:52:05 compute-0 sudo[67101]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:52:05 compute-0 python3.9[67103]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:52:05 compute-0 sudo[67101]: pam_unix(sudo:session): session closed for user root
Jan 21 17:52:05 compute-0 sudo[67224]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ccaxdqdxahysyfepuaiezxekibdjkudu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017924.8631754-252-251376526675497/AnsiballZ_copy.py'
Jan 21 17:52:05 compute-0 sudo[67224]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:52:05 compute-0 python3.9[67226]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769017924.8631754-252-251376526675497/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:52:05 compute-0 sudo[67224]: pam_unix(sudo:session): session closed for user root
Jan 21 17:52:06 compute-0 sudo[67376]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ilhdcyphtputcqwrjjouzoeapqqtveuf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017926.1344323-282-262152121905495/AnsiballZ_systemd.py'
Jan 21 17:52:06 compute-0 sudo[67376]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:52:07 compute-0 python3.9[67378]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 17:52:07 compute-0 systemd[1]: Reloading.
Jan 21 17:52:07 compute-0 systemd-rc-local-generator[67402]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 17:52:07 compute-0 systemd-sysv-generator[67410]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 17:52:07 compute-0 systemd[1]: Reloading.
Jan 21 17:52:07 compute-0 systemd-sysv-generator[67445]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 17:52:07 compute-0 systemd-rc-local-generator[67442]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 17:52:07 compute-0 systemd[1]: Starting EDPM Container Shutdown...
Jan 21 17:52:07 compute-0 systemd[1]: Finished EDPM Container Shutdown.
Jan 21 17:52:07 compute-0 sudo[67376]: pam_unix(sudo:session): session closed for user root
Jan 21 17:52:08 compute-0 sudo[67604]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fqlhytabcyrzogjqpjvcdedvjmermwui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017927.867508-298-59640453688412/AnsiballZ_stat.py'
Jan 21 17:52:08 compute-0 sudo[67604]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:52:08 compute-0 python3.9[67606]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:52:08 compute-0 sudo[67604]: pam_unix(sudo:session): session closed for user root
Jan 21 17:52:08 compute-0 sudo[67727]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hgqscxosiaidfasvsexqddbmsirsndev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017927.867508-298-59640453688412/AnsiballZ_copy.py'
Jan 21 17:52:08 compute-0 sudo[67727]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:52:08 compute-0 python3.9[67729]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769017927.867508-298-59640453688412/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:52:08 compute-0 sudo[67727]: pam_unix(sudo:session): session closed for user root
Jan 21 17:52:09 compute-0 sudo[67879]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gqddkvysqksdpfjshaponxzrnmbambsd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017929.071312-328-94896265826983/AnsiballZ_stat.py'
Jan 21 17:52:09 compute-0 sudo[67879]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:52:09 compute-0 python3.9[67881]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:52:09 compute-0 sudo[67879]: pam_unix(sudo:session): session closed for user root
Jan 21 17:52:09 compute-0 sudo[68002]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sibnktinpunikvhfwrvuhujxewmfrbcc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017929.071312-328-94896265826983/AnsiballZ_copy.py'
Jan 21 17:52:09 compute-0 sudo[68002]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:52:10 compute-0 python3.9[68004]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769017929.071312-328-94896265826983/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:52:10 compute-0 sudo[68002]: pam_unix(sudo:session): session closed for user root
Jan 21 17:52:10 compute-0 sudo[68154]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fqsnhxmgoyojenqiqukodqwbtqkjfxhz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017930.3240337-358-136950095384472/AnsiballZ_systemd.py'
Jan 21 17:52:10 compute-0 sudo[68154]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:52:10 compute-0 python3.9[68156]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 17:52:10 compute-0 systemd[1]: Reloading.
Jan 21 17:52:10 compute-0 systemd-rc-local-generator[68185]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 17:52:10 compute-0 systemd-sysv-generator[68189]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 17:52:11 compute-0 systemd[1]: Reloading.
Jan 21 17:52:11 compute-0 systemd-rc-local-generator[68221]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 17:52:11 compute-0 systemd-sysv-generator[68225]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 17:52:11 compute-0 systemd[1]: Starting Create netns directory...
Jan 21 17:52:11 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 21 17:52:11 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 21 17:52:11 compute-0 systemd[1]: Finished Create netns directory.
Jan 21 17:52:11 compute-0 sudo[68154]: pam_unix(sudo:session): session closed for user root
Jan 21 17:52:12 compute-0 python3.9[68382]: ansible-ansible.builtin.service_facts Invoked
Jan 21 17:52:12 compute-0 network[68399]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 21 17:52:12 compute-0 network[68400]: 'network-scripts' will be removed from distribution in near future.
Jan 21 17:52:12 compute-0 network[68401]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 21 17:52:16 compute-0 sudo[68661]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iikwkghtjfzvwcmqdjuqebrfglvqpnuv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017936.2861056-390-112760796488350/AnsiballZ_systemd.py'
Jan 21 17:52:16 compute-0 sudo[68661]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:52:16 compute-0 python3.9[68663]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 17:52:16 compute-0 systemd[1]: Reloading.
Jan 21 17:52:16 compute-0 systemd-rc-local-generator[68693]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 17:52:16 compute-0 systemd-sysv-generator[68696]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 17:52:17 compute-0 systemd[1]: Stopping IPv4 firewall with iptables...
Jan 21 17:52:17 compute-0 iptables.init[68703]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Jan 21 17:52:17 compute-0 iptables.init[68703]: iptables: Flushing firewall rules: [  OK  ]
Jan 21 17:52:17 compute-0 systemd[1]: iptables.service: Deactivated successfully.
Jan 21 17:52:17 compute-0 systemd[1]: Stopped IPv4 firewall with iptables.
Jan 21 17:52:17 compute-0 sudo[68661]: pam_unix(sudo:session): session closed for user root
Jan 21 17:52:17 compute-0 sudo[68898]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oniteeecudtougsfdlzegvrgynbrqkkm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017937.6170251-390-74556819539126/AnsiballZ_systemd.py'
Jan 21 17:52:17 compute-0 sudo[68898]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:52:18 compute-0 python3.9[68900]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 17:52:18 compute-0 sudo[68898]: pam_unix(sudo:session): session closed for user root
Jan 21 17:52:18 compute-0 sudo[69052]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ykfcvkilsmiturjcpwbygxcpubtfxiry ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017938.6447103-422-79996577828797/AnsiballZ_systemd.py'
Jan 21 17:52:18 compute-0 sudo[69052]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:52:19 compute-0 python3.9[69054]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 17:52:19 compute-0 systemd[1]: Reloading.
Jan 21 17:52:19 compute-0 systemd-rc-local-generator[69084]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 17:52:19 compute-0 systemd-sysv-generator[69088]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 17:52:19 compute-0 systemd[1]: Starting Netfilter Tables...
Jan 21 17:52:19 compute-0 systemd[1]: Finished Netfilter Tables.
Jan 21 17:52:19 compute-0 sudo[69052]: pam_unix(sudo:session): session closed for user root
Jan 21 17:52:21 compute-0 sudo[69244]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fjnobcgcmrwhdhmhvcicfgmmibbyeedn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017940.6850321-438-124188971934861/AnsiballZ_command.py'
Jan 21 17:52:21 compute-0 sudo[69244]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:52:21 compute-0 python3.9[69246]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 17:52:21 compute-0 sudo[69244]: pam_unix(sudo:session): session closed for user root
Jan 21 17:52:22 compute-0 sudo[69397]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-neldglylsmvcuiluloapqxulkmdxoftz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017942.0362422-466-137169737937749/AnsiballZ_stat.py'
Jan 21 17:52:22 compute-0 sudo[69397]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:52:22 compute-0 python3.9[69399]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:52:22 compute-0 sudo[69397]: pam_unix(sudo:session): session closed for user root
Jan 21 17:52:22 compute-0 sudo[69522]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-avycogcjhwrwxvgkccfckdnvsdpdixck ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017942.0362422-466-137169737937749/AnsiballZ_copy.py'
Jan 21 17:52:22 compute-0 sudo[69522]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:52:23 compute-0 python3.9[69524]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769017942.0362422-466-137169737937749/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:52:23 compute-0 sudo[69522]: pam_unix(sudo:session): session closed for user root
Jan 21 17:52:23 compute-0 sudo[69675]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxvgdxgssxankyawnpvxsjzvwhrrhlqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017943.334888-496-150624140409591/AnsiballZ_systemd.py'
Jan 21 17:52:23 compute-0 sudo[69675]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:52:23 compute-0 python3.9[69677]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 21 17:52:23 compute-0 systemd[1]: Reloading OpenSSH server daemon...
Jan 21 17:52:23 compute-0 sshd[1003]: Received SIGHUP; restarting.
Jan 21 17:52:23 compute-0 sshd[1003]: Server listening on 0.0.0.0 port 22.
Jan 21 17:52:23 compute-0 sshd[1003]: Server listening on :: port 22.
Jan 21 17:52:23 compute-0 systemd[1]: Reloaded OpenSSH server daemon.
Jan 21 17:52:23 compute-0 sudo[69675]: pam_unix(sudo:session): session closed for user root
Jan 21 17:52:24 compute-0 chronyd[64950]: Selected source 206.108.0.131 (pool.ntp.org)
Jan 21 17:52:24 compute-0 sudo[69831]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sudogkovllzfyttaxhinqmudyygknkht ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017944.1666217-512-113578765683998/AnsiballZ_file.py'
Jan 21 17:52:24 compute-0 sudo[69831]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:52:24 compute-0 python3.9[69833]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:52:24 compute-0 sudo[69831]: pam_unix(sudo:session): session closed for user root
Jan 21 17:52:25 compute-0 sudo[69983]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iefipriybmqpgcpwgnweexzuvlolswpm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017944.8917553-528-236641355461072/AnsiballZ_stat.py'
Jan 21 17:52:25 compute-0 sudo[69983]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:52:25 compute-0 python3.9[69985]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:52:25 compute-0 sudo[69983]: pam_unix(sudo:session): session closed for user root
Jan 21 17:52:25 compute-0 sudo[70106]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ytepeaslctassmeepjqqscomzppubkfa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017944.8917553-528-236641355461072/AnsiballZ_copy.py'
Jan 21 17:52:25 compute-0 sudo[70106]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:52:25 compute-0 python3.9[70108]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769017944.8917553-528-236641355461072/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:52:25 compute-0 sudo[70106]: pam_unix(sudo:session): session closed for user root
Jan 21 17:52:26 compute-0 sudo[70258]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tnyzppwnkbnsbwdnjnmnakoglmbcfqfy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017946.3169594-564-173072443419565/AnsiballZ_timezone.py'
Jan 21 17:52:26 compute-0 sudo[70258]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:52:27 compute-0 python3.9[70260]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Jan 21 17:52:27 compute-0 systemd[1]: Starting Time & Date Service...
Jan 21 17:52:27 compute-0 systemd[1]: Started Time & Date Service.
Jan 21 17:52:27 compute-0 sudo[70258]: pam_unix(sudo:session): session closed for user root
Jan 21 17:52:27 compute-0 sudo[70414]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gryraqsuxrxbwntnvgcknsgcriigkjud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017947.5767097-582-4848726204290/AnsiballZ_file.py'
Jan 21 17:52:27 compute-0 sudo[70414]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:52:28 compute-0 python3.9[70416]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:52:28 compute-0 sudo[70414]: pam_unix(sudo:session): session closed for user root
Jan 21 17:52:28 compute-0 sudo[70566]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzzolxyugzjxetsqoqtpqrwejmernhiq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017948.270369-598-67199322987156/AnsiballZ_stat.py'
Jan 21 17:52:28 compute-0 sudo[70566]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:52:28 compute-0 python3.9[70568]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:52:28 compute-0 sudo[70566]: pam_unix(sudo:session): session closed for user root
Jan 21 17:52:29 compute-0 sudo[70689]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nyzobqvhbujerdqgpxasikbwjehukrwm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017948.270369-598-67199322987156/AnsiballZ_copy.py'
Jan 21 17:52:29 compute-0 sudo[70689]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:52:29 compute-0 python3.9[70691]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769017948.270369-598-67199322987156/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:52:29 compute-0 sudo[70689]: pam_unix(sudo:session): session closed for user root
Jan 21 17:52:29 compute-0 sudo[70841]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iqnnuiqewooyljkmhrszmgxbltujragz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017949.666218-628-281374650252445/AnsiballZ_stat.py'
Jan 21 17:52:29 compute-0 sudo[70841]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:52:30 compute-0 python3.9[70843]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:52:30 compute-0 sudo[70841]: pam_unix(sudo:session): session closed for user root
Jan 21 17:52:30 compute-0 sudo[70964]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vzdnjcixtzmfbooymijvaqlpudzyujdw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017949.666218-628-281374650252445/AnsiballZ_copy.py'
Jan 21 17:52:30 compute-0 sudo[70964]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:52:30 compute-0 python3.9[70966]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769017949.666218-628-281374650252445/.source.yaml _original_basename=.j975mezt follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:52:30 compute-0 sudo[70964]: pam_unix(sudo:session): session closed for user root
Jan 21 17:52:31 compute-0 sudo[71116]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ngvqtstdbljqkrlgoedgqvgmxjigimak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017951.0175166-658-174690016181072/AnsiballZ_stat.py'
Jan 21 17:52:31 compute-0 sudo[71116]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:52:31 compute-0 python3.9[71118]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:52:31 compute-0 sudo[71116]: pam_unix(sudo:session): session closed for user root
Jan 21 17:52:31 compute-0 sudo[71239]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aycmcuirwlpwwvxnmbpaupsupkggdmzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017951.0175166-658-174690016181072/AnsiballZ_copy.py'
Jan 21 17:52:31 compute-0 sudo[71239]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:52:32 compute-0 python3.9[71241]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769017951.0175166-658-174690016181072/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:52:32 compute-0 sudo[71239]: pam_unix(sudo:session): session closed for user root
Jan 21 17:52:32 compute-0 sudo[71391]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mtuwxhlptufhiftpotjsmuecpyxbvybn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017952.338925-688-11139066305870/AnsiballZ_command.py'
Jan 21 17:52:32 compute-0 sudo[71391]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:52:32 compute-0 python3.9[71393]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 17:52:32 compute-0 sudo[71391]: pam_unix(sudo:session): session closed for user root
Jan 21 17:52:33 compute-0 sudo[71545]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cxwzslvwvdlqpbcofocpskpyqxdbqjua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017953.0176623-704-247231824168265/AnsiballZ_command.py'
Jan 21 17:52:33 compute-0 sudo[71545]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:52:33 compute-0 python3.9[71547]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 17:52:33 compute-0 sudo[71545]: pam_unix(sudo:session): session closed for user root
Jan 21 17:52:34 compute-0 sudo[71698]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kbrqtlscvknphqvifjexydbmrgqcbydi ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769017953.6962202-720-150473448883885/AnsiballZ_edpm_nftables_from_files.py'
Jan 21 17:52:34 compute-0 sudo[71698]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:52:34 compute-0 python3[71700]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 21 17:52:34 compute-0 sudo[71698]: pam_unix(sudo:session): session closed for user root
Jan 21 17:52:34 compute-0 sudo[71850]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-soohxofqzzjxfnacnzsxnunxwgtrpsjl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017954.550786-736-203777649408417/AnsiballZ_stat.py'
Jan 21 17:52:34 compute-0 sudo[71850]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:52:35 compute-0 python3.9[71852]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:52:35 compute-0 sudo[71850]: pam_unix(sudo:session): session closed for user root
Jan 21 17:52:35 compute-0 sudo[71973]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zmjlvaqpmomcisaocsjlmlgnrpyxkgeu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017954.550786-736-203777649408417/AnsiballZ_copy.py'
Jan 21 17:52:35 compute-0 sudo[71973]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:52:35 compute-0 python3.9[71975]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769017954.550786-736-203777649408417/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:52:35 compute-0 sudo[71973]: pam_unix(sudo:session): session closed for user root
Jan 21 17:52:36 compute-0 sudo[72125]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tqzsnvljfzprzpkofngusimcuduyqfev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017955.818409-766-274971528218197/AnsiballZ_stat.py'
Jan 21 17:52:36 compute-0 sudo[72125]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:52:36 compute-0 python3.9[72127]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:52:36 compute-0 sudo[72125]: pam_unix(sudo:session): session closed for user root
Jan 21 17:52:36 compute-0 sudo[72248]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhchvdmungxqfvhnmrhkopdhoczpupxj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017955.818409-766-274971528218197/AnsiballZ_copy.py'
Jan 21 17:52:36 compute-0 sudo[72248]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:52:36 compute-0 python3.9[72250]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769017955.818409-766-274971528218197/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:52:36 compute-0 sudo[72248]: pam_unix(sudo:session): session closed for user root
Jan 21 17:52:37 compute-0 sudo[72400]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ajoqmzogzhikhqygtsphkpnoknigmlms ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017957.0690486-796-278850983009273/AnsiballZ_stat.py'
Jan 21 17:52:37 compute-0 sudo[72400]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:52:37 compute-0 python3.9[72402]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:52:37 compute-0 sudo[72400]: pam_unix(sudo:session): session closed for user root
Jan 21 17:52:37 compute-0 sudo[72523]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bnaaudsaxwedoenhyvqdovwfxycjbvoy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017957.0690486-796-278850983009273/AnsiballZ_copy.py'
Jan 21 17:52:37 compute-0 sudo[72523]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:52:38 compute-0 python3.9[72525]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769017957.0690486-796-278850983009273/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:52:38 compute-0 sudo[72523]: pam_unix(sudo:session): session closed for user root
Jan 21 17:52:38 compute-0 sudo[72675]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hbkegeufwsspfdmnewvfqqklyabnswfe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017958.2344384-826-33533515296321/AnsiballZ_stat.py'
Jan 21 17:52:38 compute-0 sudo[72675]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:52:38 compute-0 python3.9[72677]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:52:38 compute-0 sudo[72675]: pam_unix(sudo:session): session closed for user root
Jan 21 17:52:39 compute-0 sudo[72798]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gyqyqcxdxpikwdphkxiteqjbkkqgvwyn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017958.2344384-826-33533515296321/AnsiballZ_copy.py'
Jan 21 17:52:39 compute-0 sudo[72798]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:52:39 compute-0 python3.9[72800]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769017958.2344384-826-33533515296321/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:52:39 compute-0 sudo[72798]: pam_unix(sudo:session): session closed for user root
Jan 21 17:52:39 compute-0 sudo[72950]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-viwhrbzdbzevmmleidrhsoumrapamzmr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017959.5429864-856-143329501893963/AnsiballZ_stat.py'
Jan 21 17:52:39 compute-0 sudo[72950]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:52:40 compute-0 python3.9[72952]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:52:40 compute-0 sudo[72950]: pam_unix(sudo:session): session closed for user root
Jan 21 17:52:40 compute-0 sudo[73073]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zcrdapjcuqqxoppszykgoyftacsizsjd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017959.5429864-856-143329501893963/AnsiballZ_copy.py'
Jan 21 17:52:40 compute-0 sudo[73073]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:52:40 compute-0 python3.9[73075]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769017959.5429864-856-143329501893963/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:52:40 compute-0 sudo[73073]: pam_unix(sudo:session): session closed for user root
Jan 21 17:52:41 compute-0 sudo[73225]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gcscykdfdqyjwwdvhocpwubvldykutcr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017960.799398-886-205075808812771/AnsiballZ_file.py'
Jan 21 17:52:41 compute-0 sudo[73225]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:52:41 compute-0 python3.9[73227]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:52:41 compute-0 sudo[73225]: pam_unix(sudo:session): session closed for user root
Jan 21 17:52:41 compute-0 sudo[73377]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-agdajxkuivisgmyueifxcdynhtatnhwu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017961.4933112-902-61450901433187/AnsiballZ_command.py'
Jan 21 17:52:41 compute-0 sudo[73377]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:52:42 compute-0 python3.9[73379]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 17:52:42 compute-0 sudo[73377]: pam_unix(sudo:session): session closed for user root
Jan 21 17:52:42 compute-0 sudo[73536]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-buouqrqnrayxolggazdyrvilwzdknfbf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017962.2592764-918-9985809509510/AnsiballZ_blockinfile.py'
Jan 21 17:52:42 compute-0 sudo[73536]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:52:42 compute-0 python3.9[73538]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:52:42 compute-0 sudo[73536]: pam_unix(sudo:session): session closed for user root
Jan 21 17:52:43 compute-0 sudo[73689]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kbyjlddluzgjlkyrjaftvxnryzvpxvno ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017963.1383312-936-139488520177190/AnsiballZ_file.py'
Jan 21 17:52:43 compute-0 sudo[73689]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:52:43 compute-0 python3.9[73691]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:52:43 compute-0 sudo[73689]: pam_unix(sudo:session): session closed for user root
Jan 21 17:52:43 compute-0 sudo[73841]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jupuximdelxhnqjdkhqdrauqujpwvbwn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017963.6921766-936-256433770356139/AnsiballZ_file.py'
Jan 21 17:52:43 compute-0 sudo[73841]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:52:44 compute-0 python3.9[73843]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:52:44 compute-0 sudo[73841]: pam_unix(sudo:session): session closed for user root
Jan 21 17:52:44 compute-0 sudo[73993]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dswkfsjhnqnoszajovegcwyyphphyftt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017964.4086492-966-98421598008903/AnsiballZ_mount.py'
Jan 21 17:52:44 compute-0 sudo[73993]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:52:45 compute-0 python3.9[73995]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 21 17:52:45 compute-0 sudo[73993]: pam_unix(sudo:session): session closed for user root
Jan 21 17:52:45 compute-0 rsyslogd[1002]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 21 17:52:45 compute-0 sudo[74147]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jtecaozjadsbmsprndsfcadpbvslhmcm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017965.301757-966-216838930014893/AnsiballZ_mount.py'
Jan 21 17:52:45 compute-0 sudo[74147]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:52:45 compute-0 python3.9[74149]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 21 17:52:45 compute-0 sudo[74147]: pam_unix(sudo:session): session closed for user root
Jan 21 17:52:46 compute-0 sshd-session[64981]: Connection closed by 192.168.122.30 port 49638
Jan 21 17:52:46 compute-0 sshd-session[64978]: pam_unix(sshd:session): session closed for user zuul
Jan 21 17:52:46 compute-0 systemd[1]: session-15.scope: Deactivated successfully.
Jan 21 17:52:46 compute-0 systemd[1]: session-15.scope: Consumed 33.944s CPU time.
Jan 21 17:52:46 compute-0 systemd-logind[782]: Session 15 logged out. Waiting for processes to exit.
Jan 21 17:52:46 compute-0 systemd-logind[782]: Removed session 15.
Jan 21 17:52:51 compute-0 sshd-session[74175]: Accepted publickey for zuul from 192.168.122.30 port 60104 ssh2: ECDSA SHA256:cKen23vhgWniT1Xii+Y+iYDmAKCwydOnjbSSWnKmVRE
Jan 21 17:52:51 compute-0 systemd-logind[782]: New session 16 of user zuul.
Jan 21 17:52:51 compute-0 systemd[1]: Started Session 16 of User zuul.
Jan 21 17:52:51 compute-0 sshd-session[74175]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 21 17:52:52 compute-0 sudo[74328]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oibyphkbcyvkndctythiaglmqyagpzly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017971.7412837-17-61349361575018/AnsiballZ_tempfile.py'
Jan 21 17:52:52 compute-0 sudo[74328]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:52:52 compute-0 python3.9[74330]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Jan 21 17:52:52 compute-0 sudo[74328]: pam_unix(sudo:session): session closed for user root
Jan 21 17:52:53 compute-0 sudo[74480]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aqirsfznjgmcdlxlxlkprruhnspkkwxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017972.5670917-41-209300649560603/AnsiballZ_stat.py'
Jan 21 17:52:53 compute-0 sudo[74480]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:52:53 compute-0 python3.9[74482]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 17:52:53 compute-0 sudo[74480]: pam_unix(sudo:session): session closed for user root
Jan 21 17:52:54 compute-0 sudo[74632]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ywamyyietzgrnsmojzodyrozkdqublfv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017973.4697833-61-189891681647175/AnsiballZ_setup.py'
Jan 21 17:52:54 compute-0 sudo[74632]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:52:54 compute-0 python3.9[74634]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 17:52:54 compute-0 sudo[74632]: pam_unix(sudo:session): session closed for user root
Jan 21 17:52:55 compute-0 sudo[74784]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-agghbvgrverdvffarfkcwuxbuwmuxbdj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017974.678417-78-80738276445872/AnsiballZ_blockinfile.py'
Jan 21 17:52:55 compute-0 sudo[74784]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:52:55 compute-0 python3.9[74786]: ansible-ansible.builtin.blockinfile Invoked with block=compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCdfv9/W8hvb+UfljIHuSMV0WqMyFEdwj5eGT0NsMMpe+FtsABSibemZRI7yWBnCINuvumIzXPUr9CPsQwd1TQ9AAyBbbUQrBK4L/4Y10FtftdLhZzS/LpuMvbnpX3rUsDYQTiZhWh00dRU9z3eqIKsdwfOxp33UjL41VbrkLg+3dcLMad4hY0snnOQixWh9YmHJICJvx3jTrK4yZM93P5Ys2NtRbG4jVxZwQ+veCNAo4V9JFRrrGA18cIghXpZB5b1iPXYgR66hkVDOo5Yday9DffuQf9T8pY7NNa/X0CBY5O/CJD4v2Z6v1JEBzCvAGadhOvk5x6xVDqDR3rRjtVIBU/rObzw4vAQQkYWZLzG81SXEIf/43CUWWtT5txvjxd8FZY7IZn7xroTcmEteeTPFyv/Bl+iJl5YxtWI0C74V2rvtkZ78hUVikkpIgT281C8TFALcD3LVHpxITGbHO1ttufNlwqtVurX6SEtUozmciMa3jVua3Btb7MKWqhrDsk=
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIPsGeahNw2BwmUsu+cW3yoOrhGi5GD37xzoLpc7QdCUQ
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOLb1qBcrW0ADXTB9rrZVddS8kdYPsIj8CnIgUsxjVYxRPnecQp8O3HUM8SEhSOPrsOScJjS1vodTs6ntMMm6kw=
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC2atnCiyt+kZn6huj0P6VWn+3B4dI4Nmp/hg4yi9j532CLA6SOijysE/70pHcm0B/NDqGLsMeL9S6EJuVc4v2EZIoGjYqzdOeqzUChkzEMq3c8dUeVW1dJgk+r+LtN+0NBDcYLT2h5a5Ct0NqRWLH+yOgTnQpgg+h0DV2Y56UyBre8LlYUBxyDDzGZplNwZ6PmoQaET4vT5MtDQwSL7/Idygmoc8SZHF5q2WLxYhdy7tyxDc8Qk02xI30CGGAL+7UTkWStpXI0T5YWeZk9iYrG/Pkb9Nu+XKoKChxKAXKtQhjYBljj1jVxnV3wXscEyy4NOihEA81giDD0r7zFuemhz9NsVDmq+F06Yc8Pqm5oMVBKy1E/TF+l26XlV35ZgPF13aYeh88v2fprz5F90OiBHMcc7gABwvPEkJ5qjIIfECFiZnAU9o168xKD/WRR3wbN1RmL+vJO2h4aW2GmJhrlng2DffujyABXpkCCx7ktUXJU6JrD9kjxKIk4gce2sr0=
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIIRPb+PoG5fO3HnP+tAbI98+jNuFDtjC3haBwBU8NRJG
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKr7gjMj8iaCIx3TBdZgIzQJEgUH5Co4gGNPtqAqRhLrurA6L75DozgtjB/fAoh+DGd7bl+UWW3oiKKyki6X3AA=
                                             create=True mode=0644 path=/tmp/ansible.rd3c1qvs state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:52:55 compute-0 sudo[74784]: pam_unix(sudo:session): session closed for user root
Jan 21 17:52:56 compute-0 sudo[74936]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-exenahobjkjwffqgmvxylrpjlypjshoe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017975.5641377-94-62111611709609/AnsiballZ_command.py'
Jan 21 17:52:56 compute-0 sudo[74936]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:52:56 compute-0 python3.9[74938]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.rd3c1qvs' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 17:52:56 compute-0 sudo[74936]: pam_unix(sudo:session): session closed for user root
Jan 21 17:52:57 compute-0 sudo[75090]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cucfeosxyoezgjrftdwllqzfcfawjxpi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017976.5167084-110-190442393489024/AnsiballZ_file.py'
Jan 21 17:52:57 compute-0 sudo[75090]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:52:57 compute-0 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 21 17:52:57 compute-0 python3.9[75092]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.rd3c1qvs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:52:57 compute-0 sudo[75090]: pam_unix(sudo:session): session closed for user root
Jan 21 17:52:58 compute-0 sshd-session[74178]: Connection closed by 192.168.122.30 port 60104
Jan 21 17:52:58 compute-0 sshd-session[74175]: pam_unix(sshd:session): session closed for user zuul
Jan 21 17:52:58 compute-0 systemd[1]: session-16.scope: Deactivated successfully.
Jan 21 17:52:58 compute-0 systemd[1]: session-16.scope: Consumed 3.644s CPU time.
Jan 21 17:52:58 compute-0 systemd-logind[782]: Session 16 logged out. Waiting for processes to exit.
Jan 21 17:52:58 compute-0 systemd-logind[782]: Removed session 16.
Jan 21 17:53:04 compute-0 sshd-session[75119]: Accepted publickey for zuul from 192.168.122.30 port 52924 ssh2: ECDSA SHA256:cKen23vhgWniT1Xii+Y+iYDmAKCwydOnjbSSWnKmVRE
Jan 21 17:53:04 compute-0 systemd-logind[782]: New session 17 of user zuul.
Jan 21 17:53:04 compute-0 systemd[1]: Started Session 17 of User zuul.
Jan 21 17:53:04 compute-0 sshd-session[75119]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 21 17:53:05 compute-0 python3.9[75272]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 17:53:06 compute-0 sudo[75426]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-osjwahszjyvuhybnqurtlmageybxlewv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017985.5742633-39-183382713376964/AnsiballZ_systemd.py'
Jan 21 17:53:06 compute-0 sudo[75426]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:53:06 compute-0 python3.9[75428]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Jan 21 17:53:06 compute-0 sudo[75426]: pam_unix(sudo:session): session closed for user root
Jan 21 17:53:07 compute-0 sudo[75580]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jycrvqhtjjysuojoxyncqsqfsdvbgigu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017986.8054342-55-233196565383059/AnsiballZ_systemd.py'
Jan 21 17:53:07 compute-0 sudo[75580]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:53:07 compute-0 python3.9[75582]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 21 17:53:07 compute-0 sudo[75580]: pam_unix(sudo:session): session closed for user root
Jan 21 17:53:08 compute-0 sudo[75733]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iztytbnocffqhoxclqxytnzoebtkgspn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017987.7099829-73-232526587254827/AnsiballZ_command.py'
Jan 21 17:53:08 compute-0 sudo[75733]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:53:08 compute-0 python3.9[75735]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 17:53:08 compute-0 sudo[75733]: pam_unix(sudo:session): session closed for user root
Jan 21 17:53:09 compute-0 sudo[75886]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-awqlbbahqtskrzskxvxdwpaimoodoitn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017988.5626242-89-80697422012334/AnsiballZ_stat.py'
Jan 21 17:53:09 compute-0 sudo[75886]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:53:09 compute-0 python3.9[75888]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 17:53:09 compute-0 sudo[75886]: pam_unix(sudo:session): session closed for user root
Jan 21 17:53:11 compute-0 sudo[76040]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lgocmdatxiyocjpcgwbyoomufkcrxziu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017990.9531488-105-238452933660907/AnsiballZ_command.py'
Jan 21 17:53:11 compute-0 sudo[76040]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:53:11 compute-0 python3.9[76042]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 17:53:11 compute-0 sudo[76040]: pam_unix(sudo:session): session closed for user root
Jan 21 17:53:12 compute-0 sudo[76195]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jcvuynnvmdahwpeshhiortrqokpctxwm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017991.6995008-121-47546422892562/AnsiballZ_file.py'
Jan 21 17:53:12 compute-0 sudo[76195]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:53:12 compute-0 python3.9[76197]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:53:12 compute-0 sudo[76195]: pam_unix(sudo:session): session closed for user root
Jan 21 17:53:12 compute-0 sshd-session[75122]: Connection closed by 192.168.122.30 port 52924
Jan 21 17:53:12 compute-0 sshd-session[75119]: pam_unix(sshd:session): session closed for user zuul
Jan 21 17:53:12 compute-0 systemd[1]: session-17.scope: Deactivated successfully.
Jan 21 17:53:12 compute-0 systemd[1]: session-17.scope: Consumed 4.881s CPU time.
Jan 21 17:53:12 compute-0 systemd-logind[782]: Session 17 logged out. Waiting for processes to exit.
Jan 21 17:53:12 compute-0 systemd-logind[782]: Removed session 17.
Jan 21 17:53:17 compute-0 sshd-session[76222]: Accepted publickey for zuul from 192.168.122.30 port 46948 ssh2: ECDSA SHA256:cKen23vhgWniT1Xii+Y+iYDmAKCwydOnjbSSWnKmVRE
Jan 21 17:53:18 compute-0 systemd-logind[782]: New session 18 of user zuul.
Jan 21 17:53:18 compute-0 systemd[1]: Started Session 18 of User zuul.
Jan 21 17:53:18 compute-0 sshd-session[76222]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 21 17:53:19 compute-0 python3.9[76375]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 17:53:19 compute-0 sudo[76529]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lhyhuqmmwmwobxylrcxmmheiibjwxnoi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017999.605168-43-23069139879944/AnsiballZ_setup.py'
Jan 21 17:53:19 compute-0 sudo[76529]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:53:20 compute-0 python3.9[76531]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 21 17:53:20 compute-0 sudo[76529]: pam_unix(sudo:session): session closed for user root
Jan 21 17:53:20 compute-0 sudo[76613]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qyrymmhftcvfqfiearseeswrfzuujqut ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769017999.605168-43-23069139879944/AnsiballZ_dnf.py'
Jan 21 17:53:20 compute-0 sudo[76613]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:53:21 compute-0 python3.9[76615]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 21 17:53:22 compute-0 sudo[76613]: pam_unix(sudo:session): session closed for user root
Jan 21 17:53:23 compute-0 python3.9[76766]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 17:53:24 compute-0 python3.9[76917]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 21 17:53:25 compute-0 python3.9[77067]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 17:53:25 compute-0 python3.9[77217]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 17:53:26 compute-0 sshd-session[76225]: Connection closed by 192.168.122.30 port 46948
Jan 21 17:53:26 compute-0 sshd-session[76222]: pam_unix(sshd:session): session closed for user zuul
Jan 21 17:53:26 compute-0 systemd[1]: session-18.scope: Deactivated successfully.
Jan 21 17:53:26 compute-0 systemd[1]: session-18.scope: Consumed 5.903s CPU time.
Jan 21 17:53:26 compute-0 systemd-logind[782]: Session 18 logged out. Waiting for processes to exit.
Jan 21 17:53:26 compute-0 systemd-logind[782]: Removed session 18.
Jan 21 17:53:32 compute-0 sshd-session[77242]: Accepted publickey for zuul from 192.168.122.30 port 54286 ssh2: ECDSA SHA256:cKen23vhgWniT1Xii+Y+iYDmAKCwydOnjbSSWnKmVRE
Jan 21 17:53:32 compute-0 systemd-logind[782]: New session 19 of user zuul.
Jan 21 17:53:32 compute-0 systemd[1]: Started Session 19 of User zuul.
Jan 21 17:53:32 compute-0 sshd-session[77242]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 21 17:53:33 compute-0 python3.9[77395]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 17:53:35 compute-0 sudo[77549]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qxzppwvqnmlnjgvjqugctpjbctypzacb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018014.7877862-76-277341732452441/AnsiballZ_file.py'
Jan 21 17:53:35 compute-0 sudo[77549]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:53:35 compute-0 python3.9[77551]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 17:53:35 compute-0 sudo[77549]: pam_unix(sudo:session): session closed for user root
Jan 21 17:53:35 compute-0 sudo[77701]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-upioxqvcybuigegemiwbjuojdqjvjitx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018015.4504006-76-66804598750011/AnsiballZ_file.py'
Jan 21 17:53:35 compute-0 sudo[77701]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:53:35 compute-0 python3.9[77703]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 17:53:35 compute-0 sudo[77701]: pam_unix(sudo:session): session closed for user root
Jan 21 17:53:36 compute-0 sudo[77853]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sbpjutuzvjkeratpxgfxrdvtlmfsakbd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018016.040695-104-152387924382850/AnsiballZ_stat.py'
Jan 21 17:53:36 compute-0 sudo[77853]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:53:36 compute-0 python3.9[77855]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:53:36 compute-0 sudo[77853]: pam_unix(sudo:session): session closed for user root
Jan 21 17:53:37 compute-0 sudo[77978]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vswnsevuyunzehnuyjvrzhncqhqpejqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018016.040695-104-152387924382850/AnsiballZ_copy.py'
Jan 21 17:53:37 compute-0 sudo[77978]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:53:37 compute-0 python3.9[77980]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769018016.040695-104-152387924382850/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=eb1f3f258219ebab3b20c0f146265051303be881 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:53:37 compute-0 sshd-session[77926]: Invalid user ubuntu from 64.227.98.100 port 34944
Jan 21 17:53:37 compute-0 sudo[77978]: pam_unix(sudo:session): session closed for user root
Jan 21 17:53:37 compute-0 sshd-session[77926]: Connection closed by invalid user ubuntu 64.227.98.100 port 34944 [preauth]
Jan 21 17:53:37 compute-0 sudo[78130]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-doulfflnqxpqvyntmsjjswdaptmoorhv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018017.5953987-104-53008409508395/AnsiballZ_stat.py'
Jan 21 17:53:37 compute-0 sudo[78130]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:53:38 compute-0 python3.9[78132]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:53:38 compute-0 sudo[78130]: pam_unix(sudo:session): session closed for user root
Jan 21 17:53:38 compute-0 sudo[78253]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dtzarfnyddvidwtegkqlkfqumzbhyben ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018017.5953987-104-53008409508395/AnsiballZ_copy.py'
Jan 21 17:53:38 compute-0 sudo[78253]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:53:38 compute-0 python3.9[78255]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769018017.5953987-104-53008409508395/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=47898ba1c114a730e6ef2f3da4f6b4ba2f67da3a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:53:38 compute-0 sudo[78253]: pam_unix(sudo:session): session closed for user root
Jan 21 17:53:38 compute-0 sudo[78405]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uqvzrujnodcrfxnejujekijmqbdgdmnd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018018.6144345-104-174492270612473/AnsiballZ_stat.py'
Jan 21 17:53:38 compute-0 sudo[78405]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:53:39 compute-0 python3.9[78407]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:53:39 compute-0 sudo[78405]: pam_unix(sudo:session): session closed for user root
Jan 21 17:53:39 compute-0 sudo[78528]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzajmxjfpxifbnxgebkezotscifxwicl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018018.6144345-104-174492270612473/AnsiballZ_copy.py'
Jan 21 17:53:39 compute-0 sudo[78528]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:53:39 compute-0 python3.9[78530]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769018018.6144345-104-174492270612473/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=307ae909eabf4cb6cd7b1cf7683feb228de61829 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:53:39 compute-0 sudo[78528]: pam_unix(sudo:session): session closed for user root
Jan 21 17:53:40 compute-0 sudo[78680]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iulbbvpuxwqayggxscfujocehlzbwduk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018019.8084145-191-265355247551363/AnsiballZ_file.py'
Jan 21 17:53:40 compute-0 sudo[78680]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:53:40 compute-0 python3.9[78682]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 17:53:40 compute-0 sudo[78680]: pam_unix(sudo:session): session closed for user root
Jan 21 17:53:40 compute-0 sudo[78832]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sqkkopvizccxynpnljcffxajsmfyxfpx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018020.3736525-191-220575528214261/AnsiballZ_file.py'
Jan 21 17:53:40 compute-0 sudo[78832]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:53:40 compute-0 python3.9[78834]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 17:53:40 compute-0 sudo[78832]: pam_unix(sudo:session): session closed for user root
Jan 21 17:53:41 compute-0 sudo[78984]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzpqweubgtoilhtzolckovilkpthnbei ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018021.0957148-222-122091359097338/AnsiballZ_stat.py'
Jan 21 17:53:41 compute-0 sudo[78984]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:53:41 compute-0 python3.9[78986]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:53:41 compute-0 sudo[78984]: pam_unix(sudo:session): session closed for user root
Jan 21 17:53:41 compute-0 sudo[79107]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lopwqhoumonxmddnmidpnnngnjaezqnd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018021.0957148-222-122091359097338/AnsiballZ_copy.py'
Jan 21 17:53:41 compute-0 sudo[79107]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:53:42 compute-0 python3.9[79109]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769018021.0957148-222-122091359097338/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=009098885abf7cb4d35603defa24fd59056ea611 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:53:42 compute-0 sudo[79107]: pam_unix(sudo:session): session closed for user root
Jan 21 17:53:42 compute-0 sudo[79259]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwwzhwhjrfekowrcefsmffuedlnmnliw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018022.2363827-222-232768864287588/AnsiballZ_stat.py'
Jan 21 17:53:42 compute-0 sudo[79259]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:53:42 compute-0 python3.9[79261]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:53:42 compute-0 sudo[79259]: pam_unix(sudo:session): session closed for user root
Jan 21 17:53:43 compute-0 sudo[79382]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vihmbbvpllcnsjvovzzauphonjzekaol ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018022.2363827-222-232768864287588/AnsiballZ_copy.py'
Jan 21 17:53:43 compute-0 sudo[79382]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:53:43 compute-0 python3.9[79384]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769018022.2363827-222-232768864287588/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=3bc514afdcff74105fe9d557b77ef9b111d2ace0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:53:43 compute-0 sudo[79382]: pam_unix(sudo:session): session closed for user root
Jan 21 17:53:43 compute-0 sudo[79534]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ncuqenbiskswrzhzcenxxdfopwknipeb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018023.3905344-222-150292547662057/AnsiballZ_stat.py'
Jan 21 17:53:43 compute-0 sudo[79534]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:53:43 compute-0 python3.9[79536]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:53:43 compute-0 sudo[79534]: pam_unix(sudo:session): session closed for user root
Jan 21 17:53:44 compute-0 sudo[79657]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ahifqyddvhgvlenqjybfdiimujsbysjx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018023.3905344-222-150292547662057/AnsiballZ_copy.py'
Jan 21 17:53:44 compute-0 sudo[79657]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:53:44 compute-0 python3.9[79659]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769018023.3905344-222-150292547662057/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=7a6d4c5f1581c2d451808ed6e3d0051bddb95f37 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:53:44 compute-0 sudo[79657]: pam_unix(sudo:session): session closed for user root
Jan 21 17:53:44 compute-0 sudo[79809]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vfdikwmhprshqulprnqhpercjoutxgwg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018024.683902-309-266305848881918/AnsiballZ_file.py'
Jan 21 17:53:44 compute-0 sudo[79809]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:53:45 compute-0 python3.9[79811]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 17:53:45 compute-0 sudo[79809]: pam_unix(sudo:session): session closed for user root
Jan 21 17:53:45 compute-0 sudo[79961]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tdwlhtnvvgwaxetkuoolsdojubvuccys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018025.292246-309-64507738426896/AnsiballZ_file.py'
Jan 21 17:53:45 compute-0 sudo[79961]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:53:45 compute-0 python3.9[79963]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 17:53:45 compute-0 sudo[79961]: pam_unix(sudo:session): session closed for user root
Jan 21 17:53:46 compute-0 sudo[80113]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wqkjbrgustznocprlgqjhkjraylozssp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018025.8964121-340-176914797898466/AnsiballZ_stat.py'
Jan 21 17:53:46 compute-0 sudo[80113]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:53:46 compute-0 python3.9[80115]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:53:46 compute-0 sudo[80113]: pam_unix(sudo:session): session closed for user root
Jan 21 17:53:46 compute-0 sudo[80236]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bokolwliewhjcofwtntwpfettpuhfvom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018025.8964121-340-176914797898466/AnsiballZ_copy.py'
Jan 21 17:53:46 compute-0 sudo[80236]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:53:46 compute-0 python3.9[80238]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769018025.8964121-340-176914797898466/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=ef6eab7d4a2a8e6fa71379397da261bf3175c1e8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:53:47 compute-0 sudo[80236]: pam_unix(sudo:session): session closed for user root
Jan 21 17:53:47 compute-0 sudo[80388]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jllvokplzsiudooohtzzagfnygwgkskf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018027.1343508-340-237807882242777/AnsiballZ_stat.py'
Jan 21 17:53:47 compute-0 sudo[80388]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:53:47 compute-0 python3.9[80390]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:53:47 compute-0 sudo[80388]: pam_unix(sudo:session): session closed for user root
Jan 21 17:53:48 compute-0 sudo[80511]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ynzvlcuwqxrasooynhunlzptlvhlhtef ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018027.1343508-340-237807882242777/AnsiballZ_copy.py'
Jan 21 17:53:48 compute-0 sudo[80511]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:53:48 compute-0 python3.9[80513]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769018027.1343508-340-237807882242777/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=7427353246cb086e103daf2224f495cbd175c55e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:53:48 compute-0 sudo[80511]: pam_unix(sudo:session): session closed for user root
Jan 21 17:53:48 compute-0 sudo[80663]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oesyydepyzxdobmgwrkamejxuviwdhif ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018028.4063385-340-98533683750670/AnsiballZ_stat.py'
Jan 21 17:53:48 compute-0 sudo[80663]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:53:48 compute-0 python3.9[80665]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:53:48 compute-0 sudo[80663]: pam_unix(sudo:session): session closed for user root
Jan 21 17:53:49 compute-0 sudo[80786]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nkhtvcapymvkmpmnrcrbzebgdfvxcbje ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018028.4063385-340-98533683750670/AnsiballZ_copy.py'
Jan 21 17:53:49 compute-0 sudo[80786]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:53:49 compute-0 python3.9[80788]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769018028.4063385-340-98533683750670/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=209fd258a0385951fb069bc6a5cf3e0446431e44 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:53:49 compute-0 sudo[80786]: pam_unix(sudo:session): session closed for user root
Jan 21 17:53:50 compute-0 sudo[80938]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hnzcrgjkvzaiaaqxosffzddqgdhpqxek ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018029.6980488-436-218336329074818/AnsiballZ_file.py'
Jan 21 17:53:50 compute-0 sudo[80938]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:53:50 compute-0 python3.9[80940]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 17:53:50 compute-0 sudo[80938]: pam_unix(sudo:session): session closed for user root
Jan 21 17:53:50 compute-0 sudo[81090]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-trxeuxgnokiyjjzqhgzdrgamhfbykwec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018030.337542-436-233974354693849/AnsiballZ_file.py'
Jan 21 17:53:50 compute-0 sudo[81090]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:53:50 compute-0 python3.9[81092]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 17:53:50 compute-0 sudo[81090]: pam_unix(sudo:session): session closed for user root
Jan 21 17:53:51 compute-0 sudo[81242]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cxpimhkbnkefucaqirebmshwgkhepeyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018030.9720926-465-76892783184151/AnsiballZ_stat.py'
Jan 21 17:53:51 compute-0 sudo[81242]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:53:51 compute-0 python3.9[81244]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:53:51 compute-0 sudo[81242]: pam_unix(sudo:session): session closed for user root
Jan 21 17:53:51 compute-0 sudo[81365]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-umkytohbrnzuwljjkaqhuknpbelxidhd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018030.9720926-465-76892783184151/AnsiballZ_copy.py'
Jan 21 17:53:51 compute-0 sudo[81365]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:53:51 compute-0 python3.9[81367]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769018030.9720926-465-76892783184151/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=89aed0e02dfa7f7f675946adfd851c7b720fa61d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:53:51 compute-0 sudo[81365]: pam_unix(sudo:session): session closed for user root
Jan 21 17:53:52 compute-0 sudo[81517]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ucucmhwwsfhoztqybhsgapsjuiliqqia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018032.0249436-465-277659413543130/AnsiballZ_stat.py'
Jan 21 17:53:52 compute-0 sudo[81517]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:53:52 compute-0 python3.9[81519]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:53:52 compute-0 sudo[81517]: pam_unix(sudo:session): session closed for user root
Jan 21 17:53:52 compute-0 sudo[81640]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ypdtewtuwoinqciraldzgscdasndrurb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018032.0249436-465-277659413543130/AnsiballZ_copy.py'
Jan 21 17:53:52 compute-0 sudo[81640]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:53:52 compute-0 python3.9[81642]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769018032.0249436-465-277659413543130/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=7427353246cb086e103daf2224f495cbd175c55e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:53:53 compute-0 sudo[81640]: pam_unix(sudo:session): session closed for user root
Jan 21 17:53:53 compute-0 sudo[81792]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dknyzzuqncttasuccbekdmxgbbbthfyh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018033.13325-465-121299696943681/AnsiballZ_stat.py'
Jan 21 17:53:53 compute-0 sudo[81792]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:53:53 compute-0 python3.9[81794]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:53:53 compute-0 sudo[81792]: pam_unix(sudo:session): session closed for user root
Jan 21 17:53:53 compute-0 sudo[81915]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wasqhjylhuimidpnhlyrblcbylshpmoy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018033.13325-465-121299696943681/AnsiballZ_copy.py'
Jan 21 17:53:53 compute-0 sudo[81915]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:53:54 compute-0 python3.9[81917]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769018033.13325-465-121299696943681/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=30bf06a6c86a829695737ccf294bd502d2d66529 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:53:54 compute-0 sudo[81915]: pam_unix(sudo:session): session closed for user root
Jan 21 17:53:54 compute-0 sudo[82067]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dxnsftgqwgrtezdykfggqqylqchqzuos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018034.7360327-579-167656643249309/AnsiballZ_file.py'
Jan 21 17:53:54 compute-0 sudo[82067]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:53:55 compute-0 python3.9[82069]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 17:53:55 compute-0 sudo[82067]: pam_unix(sudo:session): session closed for user root
Jan 21 17:53:55 compute-0 sudo[82219]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pbczwfuexqmigkszacwgmnfkmilfneiv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018035.303517-594-236120168673220/AnsiballZ_stat.py'
Jan 21 17:53:55 compute-0 sudo[82219]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:53:55 compute-0 python3.9[82221]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:53:55 compute-0 sudo[82219]: pam_unix(sudo:session): session closed for user root
Jan 21 17:53:56 compute-0 sudo[82342]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iljunvubfxyuscxqxavvjuvfbbgaelok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018035.303517-594-236120168673220/AnsiballZ_copy.py'
Jan 21 17:53:56 compute-0 sudo[82342]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:53:56 compute-0 python3.9[82344]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769018035.303517-594-236120168673220/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=b2ca5e5f576f827289b8dc0eb476f75fc973645a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:53:56 compute-0 sudo[82342]: pam_unix(sudo:session): session closed for user root
Jan 21 17:53:56 compute-0 sudo[82494]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvvfpmltzrrzuhauaieqdzxfssqbdgmx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018036.4193957-625-184836810458790/AnsiballZ_file.py'
Jan 21 17:53:56 compute-0 sudo[82494]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:53:56 compute-0 python3.9[82496]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 17:53:56 compute-0 sudo[82494]: pam_unix(sudo:session): session closed for user root
Jan 21 17:53:57 compute-0 sudo[82646]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zneimpljclsvhkdlwnnzaytfovzahsso ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018037.0457594-642-230284205783466/AnsiballZ_stat.py'
Jan 21 17:53:57 compute-0 sudo[82646]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:53:57 compute-0 python3.9[82648]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:53:57 compute-0 sudo[82646]: pam_unix(sudo:session): session closed for user root
Jan 21 17:53:57 compute-0 sudo[82769]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-alwodwnusjtxypnvzeleoaidtozhuldz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018037.0457594-642-230284205783466/AnsiballZ_copy.py'
Jan 21 17:53:57 compute-0 sudo[82769]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:53:58 compute-0 python3.9[82771]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769018037.0457594-642-230284205783466/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=b2ca5e5f576f827289b8dc0eb476f75fc973645a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:53:58 compute-0 sudo[82769]: pam_unix(sudo:session): session closed for user root
Jan 21 17:53:58 compute-0 sudo[82921]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ijbxphepbocbcpckhkzwtlyxybpissas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018038.2313528-673-231934162491120/AnsiballZ_file.py'
Jan 21 17:53:58 compute-0 sudo[82921]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:53:58 compute-0 python3.9[82923]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 17:53:58 compute-0 sudo[82921]: pam_unix(sudo:session): session closed for user root
Jan 21 17:53:59 compute-0 sudo[83073]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fuqhrvoxcnujfcdqsgpgiiykfiaisgyp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018038.8474827-690-184512252575788/AnsiballZ_stat.py'
Jan 21 17:53:59 compute-0 sudo[83073]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:53:59 compute-0 python3.9[83075]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:53:59 compute-0 sudo[83073]: pam_unix(sudo:session): session closed for user root
Jan 21 17:53:59 compute-0 sudo[83196]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jombbhsbwengxgzgehwjksmevlrbicri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018038.8474827-690-184512252575788/AnsiballZ_copy.py'
Jan 21 17:53:59 compute-0 sudo[83196]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:53:59 compute-0 python3.9[83198]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769018038.8474827-690-184512252575788/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=b2ca5e5f576f827289b8dc0eb476f75fc973645a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:53:59 compute-0 sudo[83196]: pam_unix(sudo:session): session closed for user root
Jan 21 17:54:00 compute-0 sudo[83348]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cdtbwyhnrcnmnqetahtmunlxfdmexcih ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018040.0116339-720-139660195405844/AnsiballZ_file.py'
Jan 21 17:54:00 compute-0 sudo[83348]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:54:00 compute-0 python3.9[83350]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 17:54:00 compute-0 sudo[83348]: pam_unix(sudo:session): session closed for user root
Jan 21 17:54:00 compute-0 sudo[83500]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wfomujxgxtzalggmbhbvfoqsqdjmcfbz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018040.6369298-736-141213072234322/AnsiballZ_stat.py'
Jan 21 17:54:00 compute-0 sudo[83500]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:54:01 compute-0 python3.9[83502]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:54:01 compute-0 sudo[83500]: pam_unix(sudo:session): session closed for user root
Jan 21 17:54:01 compute-0 sudo[83623]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tjbukyeuqeoghpjndpalmhlzxubekjcs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018040.6369298-736-141213072234322/AnsiballZ_copy.py'
Jan 21 17:54:01 compute-0 sudo[83623]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:54:01 compute-0 python3.9[83625]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769018040.6369298-736-141213072234322/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=b2ca5e5f576f827289b8dc0eb476f75fc973645a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:54:01 compute-0 sudo[83623]: pam_unix(sudo:session): session closed for user root
Jan 21 17:54:02 compute-0 sudo[83775]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pmzlxcqfcxwrnkkxypnldtmjkmvlrqep ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018041.9575808-767-256961214763180/AnsiballZ_file.py'
Jan 21 17:54:02 compute-0 sudo[83775]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:54:02 compute-0 python3.9[83777]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 17:54:02 compute-0 sudo[83775]: pam_unix(sudo:session): session closed for user root
Jan 21 17:54:02 compute-0 sudo[83927]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vvafpxcnaftsqfouaghmjmouwxaiozsi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018042.6081245-784-171097499592519/AnsiballZ_stat.py'
Jan 21 17:54:02 compute-0 sudo[83927]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:54:03 compute-0 python3.9[83929]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:54:03 compute-0 sudo[83927]: pam_unix(sudo:session): session closed for user root
Jan 21 17:54:03 compute-0 sudo[84050]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zmpxuecttwgpbbcimuaqkzsweogufciy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018042.6081245-784-171097499592519/AnsiballZ_copy.py'
Jan 21 17:54:03 compute-0 sudo[84050]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:54:03 compute-0 python3.9[84052]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769018042.6081245-784-171097499592519/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=b2ca5e5f576f827289b8dc0eb476f75fc973645a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:54:03 compute-0 sudo[84050]: pam_unix(sudo:session): session closed for user root
Jan 21 17:54:04 compute-0 sudo[84202]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lnsmjwujralshxxnztdaeebmaykegqzc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018043.7670686-813-265558304584272/AnsiballZ_file.py'
Jan 21 17:54:04 compute-0 sudo[84202]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:54:04 compute-0 python3.9[84204]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 17:54:04 compute-0 sudo[84202]: pam_unix(sudo:session): session closed for user root
Jan 21 17:54:04 compute-0 sudo[84354]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vlvrtpqclvtuyqgqsmavkbocbqulwunm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018044.3536208-827-229109864765063/AnsiballZ_stat.py'
Jan 21 17:54:04 compute-0 sudo[84354]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:54:04 compute-0 python3.9[84356]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:54:04 compute-0 sudo[84354]: pam_unix(sudo:session): session closed for user root
Jan 21 17:54:05 compute-0 sudo[84477]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bctvvkqjeaphadrkirermcxbicowbfto ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018044.3536208-827-229109864765063/AnsiballZ_copy.py'
Jan 21 17:54:05 compute-0 sudo[84477]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:54:05 compute-0 python3.9[84479]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769018044.3536208-827-229109864765063/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=b2ca5e5f576f827289b8dc0eb476f75fc973645a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:54:05 compute-0 sudo[84477]: pam_unix(sudo:session): session closed for user root
Jan 21 17:54:05 compute-0 sudo[84629]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bgfolwhcsmhkriwexemfsebelhcixuap ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018045.4370213-858-269049870135008/AnsiballZ_file.py'
Jan 21 17:54:05 compute-0 sudo[84629]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:54:05 compute-0 python3.9[84631]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 17:54:05 compute-0 sudo[84629]: pam_unix(sudo:session): session closed for user root
Jan 21 17:54:06 compute-0 sudo[84781]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-afhncnxfvfdmdxeakndupaukdrjyplsc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018046.0418894-873-49548706508024/AnsiballZ_stat.py'
Jan 21 17:54:06 compute-0 sudo[84781]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:54:06 compute-0 python3.9[84783]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:54:06 compute-0 sudo[84781]: pam_unix(sudo:session): session closed for user root
Jan 21 17:54:06 compute-0 sudo[84904]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kdobrvviwvyzwgngrqrhaatawvjxgqzi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018046.0418894-873-49548706508024/AnsiballZ_copy.py'
Jan 21 17:54:06 compute-0 sudo[84904]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:54:07 compute-0 python3.9[84906]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769018046.0418894-873-49548706508024/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=b2ca5e5f576f827289b8dc0eb476f75fc973645a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:54:07 compute-0 sudo[84904]: pam_unix(sudo:session): session closed for user root
Jan 21 17:54:08 compute-0 sshd-session[77245]: Connection closed by 192.168.122.30 port 54286
Jan 21 17:54:08 compute-0 sshd-session[77242]: pam_unix(sshd:session): session closed for user zuul
Jan 21 17:54:08 compute-0 systemd[1]: session-19.scope: Deactivated successfully.
Jan 21 17:54:08 compute-0 systemd[1]: session-19.scope: Consumed 27.712s CPU time.
Jan 21 17:54:08 compute-0 systemd-logind[782]: Session 19 logged out. Waiting for processes to exit.
Jan 21 17:54:08 compute-0 systemd-logind[782]: Removed session 19.
Jan 21 17:54:13 compute-0 sshd-session[84931]: Accepted publickey for zuul from 192.168.122.30 port 51232 ssh2: ECDSA SHA256:cKen23vhgWniT1Xii+Y+iYDmAKCwydOnjbSSWnKmVRE
Jan 21 17:54:13 compute-0 systemd-logind[782]: New session 20 of user zuul.
Jan 21 17:54:13 compute-0 systemd[1]: Started Session 20 of User zuul.
Jan 21 17:54:13 compute-0 sshd-session[84931]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 21 17:54:14 compute-0 python3.9[85084]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 17:54:15 compute-0 sudo[85238]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jreejxhipshdfgmyxclzivuptjkuttbv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018055.4305885-43-88046527696815/AnsiballZ_file.py'
Jan 21 17:54:15 compute-0 sudo[85238]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:54:16 compute-0 python3.9[85240]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 17:54:16 compute-0 sudo[85238]: pam_unix(sudo:session): session closed for user root
Jan 21 17:54:16 compute-0 sudo[85390]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bplyhmeytivlhsvmlcnruszkwmxlrqcg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018056.1787999-43-83189111188047/AnsiballZ_file.py'
Jan 21 17:54:16 compute-0 sudo[85390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:54:16 compute-0 python3.9[85392]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 21 17:54:16 compute-0 sudo[85390]: pam_unix(sudo:session): session closed for user root
Jan 21 17:54:17 compute-0 python3.9[85542]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 17:54:18 compute-0 sudo[85692]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xdzdxhaqdazhtnzmmucskmcwhymzyumh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018057.6694107-89-106814775607967/AnsiballZ_seboolean.py'
Jan 21 17:54:18 compute-0 sudo[85692]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:54:18 compute-0 python3.9[85694]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Jan 21 17:54:19 compute-0 sudo[85692]: pam_unix(sudo:session): session closed for user root
Jan 21 17:54:20 compute-0 sudo[85848]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kohbkqljlpafsgjfmscnmeiahajssttj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018060.5520368-109-191534002998677/AnsiballZ_setup.py'
Jan 21 17:54:20 compute-0 dbus-broker-launch[769]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Jan 21 17:54:20 compute-0 sudo[85848]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:54:21 compute-0 python3.9[85850]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 21 17:54:21 compute-0 sudo[85848]: pam_unix(sudo:session): session closed for user root
Jan 21 17:54:21 compute-0 sudo[85932]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ptjmkeaybygffsjxibautilerulfrxok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018060.5520368-109-191534002998677/AnsiballZ_dnf.py'
Jan 21 17:54:21 compute-0 sudo[85932]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:54:21 compute-0 python3.9[85934]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 21 17:54:23 compute-0 sudo[85932]: pam_unix(sudo:session): session closed for user root
Jan 21 17:54:23 compute-0 sudo[86085]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-erovmlkufthwhhtziyvjbxqivylveayf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018063.3767073-133-81196209541981/AnsiballZ_systemd.py'
Jan 21 17:54:23 compute-0 sudo[86085]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:54:24 compute-0 python3.9[86087]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 21 17:54:24 compute-0 sudo[86085]: pam_unix(sudo:session): session closed for user root
Jan 21 17:54:24 compute-0 sudo[86240]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uaovtmmegfjilgaezskdhdvqhjmbcabj ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769018064.5653596-149-245500636586436/AnsiballZ_edpm_nftables_snippet.py'
Jan 21 17:54:24 compute-0 sudo[86240]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:54:25 compute-0 python3[86242]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks
                                            rule:
                                              proto: udp
                                              dport: 4789
                                          - rule_name: 119 neutron geneve networks
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              state: ["UNTRACKED"]
                                          - rule_name: 120 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: OUTPUT
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                          - rule_name: 121 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: PREROUTING
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                           dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Jan 21 17:54:25 compute-0 sudo[86240]: pam_unix(sudo:session): session closed for user root
Jan 21 17:54:25 compute-0 sudo[86392]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gffsdxtrxdwfbxzinabgpaysdfyehyan ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018065.5702574-167-270133179462945/AnsiballZ_file.py'
Jan 21 17:54:25 compute-0 sudo[86392]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:54:26 compute-0 python3.9[86394]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:54:26 compute-0 sudo[86392]: pam_unix(sudo:session): session closed for user root
Jan 21 17:54:26 compute-0 sudo[86544]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjbanismqktyqrexovuwonulmkzhfzut ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018066.1935456-183-101979035564723/AnsiballZ_stat.py'
Jan 21 17:54:26 compute-0 sudo[86544]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:54:26 compute-0 python3.9[86546]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:54:26 compute-0 sudo[86544]: pam_unix(sudo:session): session closed for user root
Jan 21 17:54:27 compute-0 sudo[86622]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wejmozllwncmgnpqgejzqfkpnehbvvyn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018066.1935456-183-101979035564723/AnsiballZ_file.py'
Jan 21 17:54:27 compute-0 sudo[86622]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:54:27 compute-0 python3.9[86624]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:54:27 compute-0 sudo[86622]: pam_unix(sudo:session): session closed for user root
Jan 21 17:54:27 compute-0 sudo[86774]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rnnvbmdavinosyajklqjdlfevawsaorl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018067.5025952-207-77612514958614/AnsiballZ_stat.py'
Jan 21 17:54:27 compute-0 sudo[86774]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:54:28 compute-0 python3.9[86776]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:54:28 compute-0 sudo[86774]: pam_unix(sudo:session): session closed for user root
Jan 21 17:54:28 compute-0 sudo[86852]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gnywgewublrapuyrobjqqtoszedibsrw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018067.5025952-207-77612514958614/AnsiballZ_file.py'
Jan 21 17:54:28 compute-0 sudo[86852]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:54:28 compute-0 python3.9[86854]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=._533_kv1 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:54:28 compute-0 sudo[86852]: pam_unix(sudo:session): session closed for user root
Jan 21 17:54:29 compute-0 sudo[87004]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jwhcgrykdaqivgvjetbtoylvosdgwevt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018068.7866254-231-51728564074058/AnsiballZ_stat.py'
Jan 21 17:54:29 compute-0 sudo[87004]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:54:29 compute-0 python3.9[87006]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:54:29 compute-0 sudo[87004]: pam_unix(sudo:session): session closed for user root
Jan 21 17:54:29 compute-0 sudo[87082]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pddywprfufcsenwzrbjlpxvuneqhcmuy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018068.7866254-231-51728564074058/AnsiballZ_file.py'
Jan 21 17:54:29 compute-0 sudo[87082]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:54:29 compute-0 python3.9[87084]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:54:29 compute-0 sudo[87082]: pam_unix(sudo:session): session closed for user root
Jan 21 17:54:30 compute-0 sudo[87234]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oopydlgpmiykziwvykqdcajtogtwyxxq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018069.9480424-257-149148074283907/AnsiballZ_command.py'
Jan 21 17:54:30 compute-0 sudo[87234]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:54:30 compute-0 python3.9[87236]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 17:54:30 compute-0 sudo[87234]: pam_unix(sudo:session): session closed for user root
Jan 21 17:54:31 compute-0 sudo[87387]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qkrsmjkmwmevtfgebzedixiencbbqmwp ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769018070.761718-273-168292900795117/AnsiballZ_edpm_nftables_from_files.py'
Jan 21 17:54:31 compute-0 sudo[87387]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:54:31 compute-0 python3[87389]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 21 17:54:31 compute-0 sudo[87387]: pam_unix(sudo:session): session closed for user root
Jan 21 17:54:31 compute-0 sudo[87539]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mctvuhryqfynaembjqgrdwwufhlscgjw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018071.6086338-289-98170536290477/AnsiballZ_stat.py'
Jan 21 17:54:31 compute-0 sudo[87539]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:54:32 compute-0 python3.9[87541]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:54:32 compute-0 sudo[87539]: pam_unix(sudo:session): session closed for user root
Jan 21 17:54:32 compute-0 sudo[87664]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dtpmlaawtkunzkuupoaezltklvlgkffg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018071.6086338-289-98170536290477/AnsiballZ_copy.py'
Jan 21 17:54:32 compute-0 sudo[87664]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:54:32 compute-0 python3.9[87666]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769018071.6086338-289-98170536290477/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:54:32 compute-0 sudo[87664]: pam_unix(sudo:session): session closed for user root
Jan 21 17:54:33 compute-0 sudo[87817]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zlvaopgctadclankmftuyvcyihhupyxo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018073.2553935-319-133000916618970/AnsiballZ_stat.py'
Jan 21 17:54:33 compute-0 sudo[87817]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:54:33 compute-0 python3.9[87819]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:54:33 compute-0 sudo[87817]: pam_unix(sudo:session): session closed for user root
Jan 21 17:54:34 compute-0 sudo[87942]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vsppobhynxjlglhoylxfynalwryjxwii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018073.2553935-319-133000916618970/AnsiballZ_copy.py'
Jan 21 17:54:34 compute-0 sudo[87942]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:54:34 compute-0 python3.9[87944]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769018073.2553935-319-133000916618970/.source.nft follow=False _original_basename=jump-chain.j2 checksum=ac8dea350c18f51f54d48dacc09613cda4c5540c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:54:34 compute-0 sudo[87942]: pam_unix(sudo:session): session closed for user root
Jan 21 17:54:34 compute-0 sudo[88094]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ircueiyeluahowmygyfogmjfkebnmtny ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018074.5383842-349-272540204136206/AnsiballZ_stat.py'
Jan 21 17:54:34 compute-0 sudo[88094]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:54:34 compute-0 python3.9[88096]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:54:35 compute-0 sudo[88094]: pam_unix(sudo:session): session closed for user root
Jan 21 17:54:35 compute-0 sudo[88220]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pqntnxikxgczbxnqbvvogtwdezbbswxb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018074.5383842-349-272540204136206/AnsiballZ_copy.py'
Jan 21 17:54:35 compute-0 sudo[88220]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:54:35 compute-0 python3.9[88222]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769018074.5383842-349-272540204136206/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:54:35 compute-0 sudo[88220]: pam_unix(sudo:session): session closed for user root
Jan 21 17:54:36 compute-0 sudo[88372]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jggzrttubtmnbfxxkofyblhtidcrbxap ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018075.7666633-379-256839744266831/AnsiballZ_stat.py'
Jan 21 17:54:36 compute-0 sudo[88372]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:54:36 compute-0 python3.9[88374]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:54:36 compute-0 sudo[88372]: pam_unix(sudo:session): session closed for user root
Jan 21 17:54:36 compute-0 sudo[88497]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zprmflemxnerqyktbxzjdmldjghjtygo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018075.7666633-379-256839744266831/AnsiballZ_copy.py'
Jan 21 17:54:36 compute-0 sudo[88497]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:54:36 compute-0 python3.9[88499]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769018075.7666633-379-256839744266831/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:54:36 compute-0 sudo[88497]: pam_unix(sudo:session): session closed for user root
Jan 21 17:54:37 compute-0 sudo[88649]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xftqvjwxazrpapbzfvasboududztnhbf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018077.1154158-409-65065549255769/AnsiballZ_stat.py'
Jan 21 17:54:37 compute-0 sudo[88649]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:54:37 compute-0 python3.9[88651]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:54:37 compute-0 sudo[88649]: pam_unix(sudo:session): session closed for user root
Jan 21 17:54:38 compute-0 sudo[88774]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dqrbrpxiccpkdbtbdhmerdvmhcsicpoi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018077.1154158-409-65065549255769/AnsiballZ_copy.py'
Jan 21 17:54:38 compute-0 sudo[88774]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:54:38 compute-0 python3.9[88776]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769018077.1154158-409-65065549255769/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:54:38 compute-0 sudo[88774]: pam_unix(sudo:session): session closed for user root
Jan 21 17:54:38 compute-0 sudo[88926]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nmiethceicfkeghzmfezfwvlxxrolcal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018078.6279907-439-224624755401036/AnsiballZ_file.py'
Jan 21 17:54:38 compute-0 sudo[88926]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:54:39 compute-0 python3.9[88928]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:54:39 compute-0 sudo[88926]: pam_unix(sudo:session): session closed for user root
Jan 21 17:54:39 compute-0 sudo[89078]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tlpfexhrlxnnohmudgjayygotbkqbkdu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018079.3574042-455-32956222921901/AnsiballZ_command.py'
Jan 21 17:54:39 compute-0 sudo[89078]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:54:39 compute-0 python3.9[89080]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 17:54:39 compute-0 sudo[89078]: pam_unix(sudo:session): session closed for user root
Jan 21 17:54:40 compute-0 sudo[89233]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tytqtguyogowavtttiwzpfxhvackmzwm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018080.097241-471-199682195052433/AnsiballZ_blockinfile.py'
Jan 21 17:54:40 compute-0 sudo[89233]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:54:40 compute-0 python3.9[89235]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:54:40 compute-0 sudo[89233]: pam_unix(sudo:session): session closed for user root
Jan 21 17:54:41 compute-0 sudo[89385]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rkhhvuzzlritevlchvswugvnzpinjfsh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018081.0573428-489-131610328966558/AnsiballZ_command.py'
Jan 21 17:54:41 compute-0 sudo[89385]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:54:41 compute-0 python3.9[89387]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 17:54:41 compute-0 sudo[89385]: pam_unix(sudo:session): session closed for user root
Jan 21 17:54:42 compute-0 sudo[89538]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thmwpjubmxwlpffoigsqaxzxqoqtyjvj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018081.8172014-505-240180160138476/AnsiballZ_stat.py'
Jan 21 17:54:42 compute-0 sudo[89538]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:54:42 compute-0 python3.9[89540]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 17:54:42 compute-0 sudo[89538]: pam_unix(sudo:session): session closed for user root
Jan 21 17:54:42 compute-0 sudo[89692]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wqdeechkcgwctpnqesvhaaooglxzndeh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018082.4581993-521-73841378017794/AnsiballZ_command.py'
Jan 21 17:54:42 compute-0 sudo[89692]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:54:42 compute-0 python3.9[89694]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 17:54:42 compute-0 sudo[89692]: pam_unix(sudo:session): session closed for user root
Jan 21 17:54:43 compute-0 sudo[89847]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bmbbskrzlgjcftrxysdjglnwoepgklrk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018083.2880604-537-175754185609713/AnsiballZ_file.py'
Jan 21 17:54:43 compute-0 sudo[89847]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:54:43 compute-0 python3.9[89849]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:54:43 compute-0 sudo[89847]: pam_unix(sudo:session): session closed for user root
Jan 21 17:54:44 compute-0 python3.9[89999]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 17:54:45 compute-0 sudo[90150]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rommjofddfksjcxgcfbembivomusrnvr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018085.5982893-617-62661313760786/AnsiballZ_command.py'
Jan 21 17:54:45 compute-0 sudo[90150]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:54:46 compute-0 python3.9[90152]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:1e:0a:8d:1d:08:09" external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch 
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 17:54:46 compute-0 ovs-vsctl[90153]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:1e:0a:8d:1d:08:09 external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Jan 21 17:54:46 compute-0 sudo[90150]: pam_unix(sudo:session): session closed for user root
Jan 21 17:54:46 compute-0 sudo[90303]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zlsklgplmhwtylvsfbfqhlgzpdtuejwm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018086.3730178-635-180412888320951/AnsiballZ_command.py'
Jan 21 17:54:46 compute-0 sudo[90303]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:54:46 compute-0 python3.9[90305]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                            ovs-vsctl show | grep -q "Manager"
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 17:54:46 compute-0 sudo[90303]: pam_unix(sudo:session): session closed for user root
Jan 21 17:54:47 compute-0 sudo[90458]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jexoanlyfyabqdgpgeaomzocjtnmsdjg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018087.1258142-651-268029504210859/AnsiballZ_command.py'
Jan 21 17:54:47 compute-0 sudo[90458]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:54:47 compute-0 python3.9[90460]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:6640:127.0.0.1\" -- add Open_vSwitch . manager_options @manager
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 17:54:47 compute-0 ovs-vsctl[90461]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Jan 21 17:54:47 compute-0 sudo[90458]: pam_unix(sudo:session): session closed for user root
Jan 21 17:54:48 compute-0 python3.9[90611]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 17:54:49 compute-0 sudo[90763]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eoimcufriqajxodlrthyiesjnuvhonlv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018089.1602347-685-186244414974335/AnsiballZ_file.py'
Jan 21 17:54:49 compute-0 sudo[90763]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:54:49 compute-0 python3.9[90765]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 17:54:49 compute-0 sudo[90763]: pam_unix(sudo:session): session closed for user root
Jan 21 17:54:50 compute-0 sudo[90915]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txoukctrvecdeujysivetmrkailyzwon ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018089.8479145-701-146752888885080/AnsiballZ_stat.py'
Jan 21 17:54:50 compute-0 sudo[90915]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:54:50 compute-0 python3.9[90917]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:54:50 compute-0 sudo[90915]: pam_unix(sudo:session): session closed for user root
Jan 21 17:54:50 compute-0 sudo[90993]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmdzojoodwtvfnvtjojeotldiyqshdvj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018089.8479145-701-146752888885080/AnsiballZ_file.py'
Jan 21 17:54:50 compute-0 sudo[90993]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:54:50 compute-0 python3.9[90995]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 17:54:50 compute-0 sudo[90993]: pam_unix(sudo:session): session closed for user root
Jan 21 17:54:51 compute-0 sudo[91145]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pzzqxhzfekcnslltdrmtafdmazmgvakg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018090.9795182-701-177869297615903/AnsiballZ_stat.py'
Jan 21 17:54:51 compute-0 sudo[91145]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:54:51 compute-0 python3.9[91147]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:54:51 compute-0 sudo[91145]: pam_unix(sudo:session): session closed for user root
Jan 21 17:54:51 compute-0 sudo[91223]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xkcvkoztmcbfqeosyxlwjamervyusdxd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018090.9795182-701-177869297615903/AnsiballZ_file.py'
Jan 21 17:54:51 compute-0 sudo[91223]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:54:51 compute-0 python3.9[91225]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 17:54:51 compute-0 sudo[91223]: pam_unix(sudo:session): session closed for user root
Jan 21 17:54:52 compute-0 sudo[91375]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ppgrhagvbzqwxokkhilmjkxmzcnkwoua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018092.0642297-747-237515705438049/AnsiballZ_file.py'
Jan 21 17:54:52 compute-0 sudo[91375]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:54:52 compute-0 python3.9[91377]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:54:52 compute-0 sudo[91375]: pam_unix(sudo:session): session closed for user root
Jan 21 17:54:52 compute-0 sudo[91527]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wlnbvffblxqxdltssrstbxvvrczsbhlq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018092.704495-763-24574585049314/AnsiballZ_stat.py'
Jan 21 17:54:52 compute-0 sudo[91527]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:54:53 compute-0 python3.9[91529]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:54:53 compute-0 sudo[91527]: pam_unix(sudo:session): session closed for user root
Jan 21 17:54:53 compute-0 sudo[91605]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gtctyxtejquotdvtswlwrldimuvmcpza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018092.704495-763-24574585049314/AnsiballZ_file.py'
Jan 21 17:54:53 compute-0 sudo[91605]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:54:53 compute-0 python3.9[91607]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:54:53 compute-0 sudo[91605]: pam_unix(sudo:session): session closed for user root
Jan 21 17:54:54 compute-0 sudo[91757]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-axbnpqqnzcdjzcouyilgffyrlvjhtppr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018093.9268928-787-75314096805238/AnsiballZ_stat.py'
Jan 21 17:54:54 compute-0 sudo[91757]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:54:54 compute-0 python3.9[91759]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:54:54 compute-0 sudo[91757]: pam_unix(sudo:session): session closed for user root
Jan 21 17:54:54 compute-0 sudo[91835]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmtnfqombunkozqaaawpcpcookaxpfxl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018093.9268928-787-75314096805238/AnsiballZ_file.py'
Jan 21 17:54:54 compute-0 sudo[91835]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:54:54 compute-0 python3.9[91837]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:54:54 compute-0 sudo[91835]: pam_unix(sudo:session): session closed for user root
Jan 21 17:54:55 compute-0 sudo[91987]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hkakqigjcbcjlvuipntalhtqeafhwsdm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018095.2447588-811-186755680017284/AnsiballZ_systemd.py'
Jan 21 17:54:55 compute-0 sudo[91987]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:54:55 compute-0 python3.9[91989]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 17:54:55 compute-0 systemd[1]: Reloading.
Jan 21 17:54:55 compute-0 systemd-rc-local-generator[92017]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 17:54:55 compute-0 systemd-sysv-generator[92020]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 17:54:56 compute-0 sudo[91987]: pam_unix(sudo:session): session closed for user root
Jan 21 17:54:56 compute-0 sudo[92177]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nabatxjtwpcihsmhcaxacjjqbmnbwwys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018096.317655-827-82287888863088/AnsiballZ_stat.py'
Jan 21 17:54:56 compute-0 sudo[92177]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:54:56 compute-0 python3.9[92179]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:54:56 compute-0 sudo[92177]: pam_unix(sudo:session): session closed for user root
Jan 21 17:54:57 compute-0 sudo[92255]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vbxksodbzbxpntjfwkewizxifwlrigil ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018096.317655-827-82287888863088/AnsiballZ_file.py'
Jan 21 17:54:57 compute-0 sudo[92255]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:54:57 compute-0 python3.9[92257]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:54:57 compute-0 sudo[92255]: pam_unix(sudo:session): session closed for user root
Jan 21 17:54:57 compute-0 sudo[92407]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tponagoqlbendlohmapirfppulyqqpkt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018097.4431126-851-40819749630101/AnsiballZ_stat.py'
Jan 21 17:54:57 compute-0 sudo[92407]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:54:57 compute-0 python3.9[92409]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:54:58 compute-0 sudo[92407]: pam_unix(sudo:session): session closed for user root
Jan 21 17:54:58 compute-0 sudo[92485]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-knrxzvikigrusyygwwogzzlicjwttoow ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018097.4431126-851-40819749630101/AnsiballZ_file.py'
Jan 21 17:54:58 compute-0 sudo[92485]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:54:58 compute-0 python3.9[92487]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:54:58 compute-0 sudo[92485]: pam_unix(sudo:session): session closed for user root
Jan 21 17:54:59 compute-0 sudo[92637]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cqgjnywjuktzrylmwiaqalqsnctryegs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018098.6432323-875-275019035524826/AnsiballZ_systemd.py'
Jan 21 17:54:59 compute-0 sudo[92637]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:54:59 compute-0 python3.9[92639]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 17:54:59 compute-0 systemd[1]: Reloading.
Jan 21 17:54:59 compute-0 systemd-rc-local-generator[92663]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 17:54:59 compute-0 systemd-sysv-generator[92670]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 17:55:00 compute-0 systemd[1]: Starting Create netns directory...
Jan 21 17:55:00 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 21 17:55:00 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 21 17:55:00 compute-0 systemd[1]: Finished Create netns directory.
Jan 21 17:55:00 compute-0 sudo[92637]: pam_unix(sudo:session): session closed for user root
Jan 21 17:55:01 compute-0 sudo[92832]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jamrrvuduslitrymkqgvyxnyuyiehrdj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018100.9350994-895-210907336133088/AnsiballZ_file.py'
Jan 21 17:55:01 compute-0 sudo[92832]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:55:01 compute-0 python3.9[92834]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 17:55:01 compute-0 sudo[92832]: pam_unix(sudo:session): session closed for user root
Jan 21 17:55:01 compute-0 sudo[92984]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ovcryjtrfxygxgmsazkpjzuqdiiuzknp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018101.6564198-911-28807481954583/AnsiballZ_stat.py'
Jan 21 17:55:01 compute-0 sudo[92984]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:55:02 compute-0 python3.9[92986]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:55:02 compute-0 sudo[92984]: pam_unix(sudo:session): session closed for user root
Jan 21 17:55:02 compute-0 sudo[93107]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iadechcvgnkikmilruezqdwxiotwtxya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018101.6564198-911-28807481954583/AnsiballZ_copy.py'
Jan 21 17:55:02 compute-0 sudo[93107]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:55:02 compute-0 python3.9[93109]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769018101.6564198-911-28807481954583/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 21 17:55:02 compute-0 sudo[93107]: pam_unix(sudo:session): session closed for user root
Jan 21 17:55:03 compute-0 sudo[93259]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-npyjvhvpzzmshkhljnovuraqkepxaxlf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018103.3050075-945-121764364348885/AnsiballZ_file.py'
Jan 21 17:55:03 compute-0 sudo[93259]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:55:03 compute-0 python3.9[93261]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:55:03 compute-0 sudo[93259]: pam_unix(sudo:session): session closed for user root
Jan 21 17:55:04 compute-0 sudo[93411]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-npiikqiepnodwulefdgfctytftptefsx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018104.1044707-961-198587248255584/AnsiballZ_file.py'
Jan 21 17:55:04 compute-0 sudo[93411]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:55:04 compute-0 python3.9[93413]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 17:55:04 compute-0 sudo[93411]: pam_unix(sudo:session): session closed for user root
Jan 21 17:55:05 compute-0 sudo[93563]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxjiqsbcqbpeqnicbphcyzsbonysmfyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018104.8610113-977-55060601399801/AnsiballZ_stat.py'
Jan 21 17:55:05 compute-0 sudo[93563]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:55:05 compute-0 python3.9[93565]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:55:05 compute-0 sudo[93563]: pam_unix(sudo:session): session closed for user root
Jan 21 17:55:05 compute-0 sudo[93686]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uafmhtiutoiftfoltuajpyfybpfyafdr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018104.8610113-977-55060601399801/AnsiballZ_copy.py'
Jan 21 17:55:05 compute-0 sudo[93686]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:55:05 compute-0 python3.9[93688]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769018104.8610113-977-55060601399801/.source.json _original_basename=.x_8uqgs4 follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:55:06 compute-0 sudo[93686]: pam_unix(sudo:session): session closed for user root
Jan 21 17:55:06 compute-0 python3.9[93838]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:55:08 compute-0 sudo[94259]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwhymozshrjacezswsazsjpxgwgxswdx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018108.279824-1057-43109243661476/AnsiballZ_container_config_data.py'
Jan 21 17:55:08 compute-0 sudo[94259]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:55:08 compute-0 python3.9[94261]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Jan 21 17:55:08 compute-0 sudo[94259]: pam_unix(sudo:session): session closed for user root
Jan 21 17:55:09 compute-0 sudo[94411]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pebbzkmzpjtcxcleyvkpplvtuozwhqvm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018109.2851934-1079-266093390891506/AnsiballZ_container_config_hash.py'
Jan 21 17:55:09 compute-0 sudo[94411]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:55:09 compute-0 python3.9[94413]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 21 17:55:09 compute-0 sudo[94411]: pam_unix(sudo:session): session closed for user root
Jan 21 17:55:10 compute-0 sudo[94563]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zrjbwpwkyjuhupibqdahtxltjfarqjcn ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769018110.5422938-1099-61843416186371/AnsiballZ_edpm_container_manage.py'
Jan 21 17:55:10 compute-0 sudo[94563]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:55:11 compute-0 python3[94565]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json containers=['ovn_controller'] log_base_path=/var/log/containers/stdouts debug=False
Jan 21 17:55:11 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 17:55:11 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 17:55:11 compute-0 podman[94603]: 2026-01-21 17:55:11.498377854 +0000 UTC m=+0.061567799 container create 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_controller)
Jan 21 17:55:11 compute-0 podman[94603]: 2026-01-21 17:55:11.46590209 +0000 UTC m=+0.029092045 image pull a17927617ef5a603f0594ee0d6df65aabdc9e0303ccc5a52c36f193de33ee0fe quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Jan 21 17:55:11 compute-0 python3[94565]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5 --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Jan 21 17:55:11 compute-0 sudo[94563]: pam_unix(sudo:session): session closed for user root
Jan 21 17:55:12 compute-0 sudo[94791]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hgnorforjykheqogcenjqlayslyecjxd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018111.8657382-1115-156259720980382/AnsiballZ_stat.py'
Jan 21 17:55:12 compute-0 sudo[94791]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:55:12 compute-0 python3.9[94793]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 17:55:12 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 17:55:12 compute-0 sudo[94791]: pam_unix(sudo:session): session closed for user root
Jan 21 17:55:12 compute-0 sudo[94945]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rdiuyeuigxbxpftmaxcmgrztrsloeyuu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018112.59554-1133-118704757081711/AnsiballZ_file.py'
Jan 21 17:55:12 compute-0 sudo[94945]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:55:13 compute-0 python3.9[94947]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:55:13 compute-0 sudo[94945]: pam_unix(sudo:session): session closed for user root
Jan 21 17:55:13 compute-0 sudo[95021]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uoqxidvccjqymkrzturkzamiantopoek ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018112.59554-1133-118704757081711/AnsiballZ_stat.py'
Jan 21 17:55:13 compute-0 sudo[95021]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:55:13 compute-0 python3.9[95023]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 17:55:13 compute-0 sudo[95021]: pam_unix(sudo:session): session closed for user root
Jan 21 17:55:13 compute-0 sudo[95172]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eajntfryvycvoorhxbmmfnumwosfbara ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018113.5617447-1133-73102100887316/AnsiballZ_copy.py'
Jan 21 17:55:13 compute-0 sudo[95172]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:55:14 compute-0 python3.9[95174]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769018113.5617447-1133-73102100887316/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:55:14 compute-0 sudo[95172]: pam_unix(sudo:session): session closed for user root
Jan 21 17:55:14 compute-0 sudo[95248]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uijbzevhcgqificwmhsmjfkslkquurmi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018113.5617447-1133-73102100887316/AnsiballZ_systemd.py'
Jan 21 17:55:14 compute-0 sudo[95248]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:55:14 compute-0 python3.9[95250]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 21 17:55:14 compute-0 systemd[1]: Reloading.
Jan 21 17:55:14 compute-0 systemd-rc-local-generator[95281]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 17:55:14 compute-0 systemd-sysv-generator[95285]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 17:55:15 compute-0 sudo[95248]: pam_unix(sudo:session): session closed for user root
Jan 21 17:55:15 compute-0 sudo[95361]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gqkzijppzfjxbaygbrwnwtvqvltnvmfa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018113.5617447-1133-73102100887316/AnsiballZ_systemd.py'
Jan 21 17:55:15 compute-0 sudo[95361]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:55:15 compute-0 python3.9[95363]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 17:55:15 compute-0 systemd[1]: Reloading.
Jan 21 17:55:15 compute-0 systemd-rc-local-generator[95388]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 17:55:15 compute-0 systemd-sysv-generator[95391]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 17:55:15 compute-0 systemd[1]: Starting ovn_controller container...
Jan 21 17:55:16 compute-0 systemd[1]: Created slice Virtual Machine and Container Slice.
Jan 21 17:55:16 compute-0 systemd[1]: Started libcrun container.
Jan 21 17:55:16 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/95d5613fe388b128e93a2cf21cded43eecf075ae863936ba279cbd9d7a796bc0/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Jan 21 17:55:16 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a.
Jan 21 17:55:16 compute-0 podman[95403]: 2026-01-21 17:55:16.16726739 +0000 UTC m=+0.150259303 container init 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller)
Jan 21 17:55:16 compute-0 ovn_controller[95419]: + sudo -E kolla_set_configs
Jan 21 17:55:16 compute-0 podman[95403]: 2026-01-21 17:55:16.205979333 +0000 UTC m=+0.188971226 container start 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 21 17:55:16 compute-0 edpm-start-podman-container[95403]: ovn_controller
Jan 21 17:55:16 compute-0 systemd[1]: Created slice User Slice of UID 0.
Jan 21 17:55:16 compute-0 systemd[1]: Starting User Runtime Directory /run/user/0...
Jan 21 17:55:16 compute-0 systemd[1]: Finished User Runtime Directory /run/user/0.
Jan 21 17:55:16 compute-0 systemd[1]: Starting User Manager for UID 0...
Jan 21 17:55:16 compute-0 systemd[95453]: pam_unix(systemd-user:session): session opened for user root(uid=0) by root(uid=0)
Jan 21 17:55:16 compute-0 edpm-start-podman-container[95402]: Creating additional drop-in dependency for "ovn_controller" (16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a)
Jan 21 17:55:16 compute-0 podman[95425]: 2026-01-21 17:55:16.306348105 +0000 UTC m=+0.088325377 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 21 17:55:16 compute-0 systemd[1]: 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a-370cf78da5908003.service: Main process exited, code=exited, status=1/FAILURE
Jan 21 17:55:16 compute-0 systemd[1]: 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a-370cf78da5908003.service: Failed with result 'exit-code'.
Jan 21 17:55:16 compute-0 systemd[1]: Reloading.
Jan 21 17:55:16 compute-0 systemd-rc-local-generator[95508]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 17:55:16 compute-0 systemd[95453]: Queued start job for default target Main User Target.
Jan 21 17:55:16 compute-0 systemd-sysv-generator[95511]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 17:55:16 compute-0 systemd[95453]: Created slice User Application Slice.
Jan 21 17:55:16 compute-0 systemd[95453]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Jan 21 17:55:16 compute-0 systemd[95453]: Started Daily Cleanup of User's Temporary Directories.
Jan 21 17:55:16 compute-0 systemd[95453]: Reached target Paths.
Jan 21 17:55:16 compute-0 systemd[95453]: Reached target Timers.
Jan 21 17:55:16 compute-0 systemd[95453]: Starting D-Bus User Message Bus Socket...
Jan 21 17:55:16 compute-0 systemd[95453]: Starting Create User's Volatile Files and Directories...
Jan 21 17:55:16 compute-0 systemd[95453]: Finished Create User's Volatile Files and Directories.
Jan 21 17:55:16 compute-0 systemd[95453]: Listening on D-Bus User Message Bus Socket.
Jan 21 17:55:16 compute-0 systemd[95453]: Reached target Sockets.
Jan 21 17:55:16 compute-0 systemd[95453]: Reached target Basic System.
Jan 21 17:55:16 compute-0 systemd[95453]: Reached target Main User Target.
Jan 21 17:55:16 compute-0 systemd[95453]: Startup finished in 124ms.
Jan 21 17:55:16 compute-0 systemd[1]: Started User Manager for UID 0.
Jan 21 17:55:16 compute-0 systemd[1]: Started ovn_controller container.
Jan 21 17:55:16 compute-0 sshd-session[95252]: Invalid user ubuntu from 106.63.7.208 port 37390
Jan 21 17:55:16 compute-0 systemd[1]: Started Session c1 of User root.
Jan 21 17:55:16 compute-0 sudo[95361]: pam_unix(sudo:session): session closed for user root
Jan 21 17:55:16 compute-0 ovn_controller[95419]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 21 17:55:16 compute-0 ovn_controller[95419]: INFO:__main__:Validating config file
Jan 21 17:55:16 compute-0 ovn_controller[95419]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 21 17:55:16 compute-0 ovn_controller[95419]: INFO:__main__:Writing out command to execute
Jan 21 17:55:16 compute-0 systemd[1]: session-c1.scope: Deactivated successfully.
Jan 21 17:55:16 compute-0 ovn_controller[95419]: ++ cat /run_command
Jan 21 17:55:16 compute-0 ovn_controller[95419]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Jan 21 17:55:16 compute-0 ovn_controller[95419]: + ARGS=
Jan 21 17:55:16 compute-0 ovn_controller[95419]: + sudo kolla_copy_cacerts
Jan 21 17:55:16 compute-0 systemd[1]: Started Session c2 of User root.
Jan 21 17:55:16 compute-0 ovn_controller[95419]: + [[ ! -n '' ]]
Jan 21 17:55:16 compute-0 ovn_controller[95419]: + . kolla_extend_start
Jan 21 17:55:16 compute-0 ovn_controller[95419]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Jan 21 17:55:16 compute-0 ovn_controller[95419]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Jan 21 17:55:16 compute-0 ovn_controller[95419]: + umask 0022
Jan 21 17:55:16 compute-0 ovn_controller[95419]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Jan 21 17:55:16 compute-0 systemd[1]: session-c2.scope: Deactivated successfully.
Jan 21 17:55:16 compute-0 ovn_controller[95419]: 2026-01-21T17:55:16Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Jan 21 17:55:16 compute-0 ovn_controller[95419]: 2026-01-21T17:55:16Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Jan 21 17:55:16 compute-0 ovn_controller[95419]: 2026-01-21T17:55:16Z|00003|main|INFO|OVN internal version is : [24.03.8-20.33.0-76.8]
Jan 21 17:55:16 compute-0 ovn_controller[95419]: 2026-01-21T17:55:16Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Jan 21 17:55:16 compute-0 ovn_controller[95419]: 2026-01-21T17:55:16Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 21 17:55:16 compute-0 ovn_controller[95419]: 2026-01-21T17:55:16Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Jan 21 17:55:16 compute-0 NetworkManager[55506]: <info>  [1769018116.7433] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Jan 21 17:55:16 compute-0 NetworkManager[55506]: <info>  [1769018116.7444] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 21 17:55:16 compute-0 NetworkManager[55506]: <warn>  [1769018116.7448] device (br-int)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 21 17:55:16 compute-0 NetworkManager[55506]: <info>  [1769018116.7457] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/15)
Jan 21 17:55:16 compute-0 NetworkManager[55506]: <info>  [1769018116.7463] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/16)
Jan 21 17:55:16 compute-0 NetworkManager[55506]: <info>  [1769018116.7469] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Jan 21 17:55:16 compute-0 kernel: br-int: entered promiscuous mode
Jan 21 17:55:16 compute-0 ovn_controller[95419]: 2026-01-21T17:55:16Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 21 17:55:16 compute-0 ovn_controller[95419]: 2026-01-21T17:55:16Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 21 17:55:16 compute-0 ovn_controller[95419]: 2026-01-21T17:55:16Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 21 17:55:16 compute-0 ovn_controller[95419]: 2026-01-21T17:55:16Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Jan 21 17:55:16 compute-0 ovn_controller[95419]: 2026-01-21T17:55:16Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Jan 21 17:55:16 compute-0 ovn_controller[95419]: 2026-01-21T17:55:16Z|00012|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Jan 21 17:55:16 compute-0 ovn_controller[95419]: 2026-01-21T17:55:16Z|00013|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Jan 21 17:55:16 compute-0 ovn_controller[95419]: 2026-01-21T17:55:16Z|00014|main|INFO|OVS feature set changed, force recompute.
Jan 21 17:55:16 compute-0 ovn_controller[95419]: 2026-01-21T17:55:16Z|00015|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 21 17:55:16 compute-0 ovn_controller[95419]: 2026-01-21T17:55:16Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 21 17:55:16 compute-0 ovn_controller[95419]: 2026-01-21T17:55:16Z|00017|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 21 17:55:16 compute-0 ovn_controller[95419]: 2026-01-21T17:55:16Z|00018|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Jan 21 17:55:16 compute-0 ovn_controller[95419]: 2026-01-21T17:55:16Z|00019|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Jan 21 17:55:16 compute-0 ovn_controller[95419]: 2026-01-21T17:55:16Z|00020|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 21 17:55:16 compute-0 ovn_controller[95419]: 2026-01-21T17:55:16Z|00021|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Jan 21 17:55:16 compute-0 ovn_controller[95419]: 2026-01-21T17:55:16Z|00022|main|INFO|OVS feature set changed, force recompute.
Jan 21 17:55:16 compute-0 ovn_controller[95419]: 2026-01-21T17:55:16Z|00023|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Jan 21 17:55:16 compute-0 ovn_controller[95419]: 2026-01-21T17:55:16Z|00024|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Jan 21 17:55:16 compute-0 ovn_controller[95419]: 2026-01-21T17:55:16Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 21 17:55:16 compute-0 ovn_controller[95419]: 2026-01-21T17:55:16Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 21 17:55:16 compute-0 ovn_controller[95419]: 2026-01-21T17:55:16Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 21 17:55:16 compute-0 ovn_controller[95419]: 2026-01-21T17:55:16Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 21 17:55:16 compute-0 ovn_controller[95419]: 2026-01-21T17:55:16Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 21 17:55:16 compute-0 NetworkManager[55506]: <info>  [1769018116.7715] manager: (ovn-de5c2e-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Jan 21 17:55:16 compute-0 kernel: genev_sys_6081: entered promiscuous mode
Jan 21 17:55:16 compute-0 ovn_controller[95419]: 2026-01-21T17:55:16Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 21 17:55:16 compute-0 NetworkManager[55506]: <info>  [1769018116.7894] device (genev_sys_6081): carrier: link connected
Jan 21 17:55:16 compute-0 NetworkManager[55506]: <info>  [1769018116.7897] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/18)
Jan 21 17:55:16 compute-0 systemd-udevd[95554]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 17:55:16 compute-0 systemd-udevd[95555]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 17:55:16 compute-0 sshd-session[95252]: Received disconnect from 106.63.7.208 port 37390:11:  [preauth]
Jan 21 17:55:16 compute-0 sshd-session[95252]: Disconnected from invalid user ubuntu 106.63.7.208 port 37390 [preauth]
Jan 21 17:55:19 compute-0 NetworkManager[55506]: <info>  [1769018119.1428] manager: (ovn-88a627-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/19)
Jan 21 17:55:19 compute-0 python3.9[95686]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 21 17:55:20 compute-0 sudo[95836]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xaowwguqrsyjcwdiouyuhhtnwatslokm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018120.3393984-1223-175472617421514/AnsiballZ_stat.py'
Jan 21 17:55:20 compute-0 sudo[95836]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:55:20 compute-0 python3.9[95838]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:55:20 compute-0 sudo[95836]: pam_unix(sudo:session): session closed for user root
Jan 21 17:55:21 compute-0 sudo[95959]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oilbrninhytjjcfkqprdeywmkwqmzuhl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018120.3393984-1223-175472617421514/AnsiballZ_copy.py'
Jan 21 17:55:21 compute-0 sudo[95959]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:55:21 compute-0 python3.9[95961]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769018120.3393984-1223-175472617421514/.source.yaml _original_basename=.st0nln53 follow=False checksum=409ec29739f03d404f6e368d453caf23ec03d1e3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:55:21 compute-0 sudo[95959]: pam_unix(sudo:session): session closed for user root
Jan 21 17:55:21 compute-0 sudo[96111]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kcxneanfkivjiwxngesaekqsrvyzfxbg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018121.6792884-1253-48268690812663/AnsiballZ_command.py'
Jan 21 17:55:21 compute-0 sudo[96111]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:55:22 compute-0 python3.9[96113]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 17:55:22 compute-0 ovs-vsctl[96114]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Jan 21 17:55:22 compute-0 sudo[96111]: pam_unix(sudo:session): session closed for user root
Jan 21 17:55:23 compute-0 sudo[96264]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qgkxzrihjtadiggrqhkqmccmryreeomu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018122.9237053-1269-74303077709854/AnsiballZ_command.py'
Jan 21 17:55:23 compute-0 sudo[96264]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:55:23 compute-0 python3.9[96266]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g' _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 17:55:23 compute-0 ovs-vsctl[96268]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Jan 21 17:55:23 compute-0 sudo[96264]: pam_unix(sudo:session): session closed for user root
Jan 21 17:55:24 compute-0 sudo[96419]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-egfiwcohrsaaqvqeqdcvcersaczrpzeo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018123.8783047-1297-51006330076135/AnsiballZ_command.py'
Jan 21 17:55:24 compute-0 sudo[96419]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:55:24 compute-0 python3.9[96421]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 17:55:24 compute-0 ovs-vsctl[96422]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Jan 21 17:55:24 compute-0 sudo[96419]: pam_unix(sudo:session): session closed for user root
Jan 21 17:55:24 compute-0 sshd-session[84934]: Connection closed by 192.168.122.30 port 51232
Jan 21 17:55:24 compute-0 sshd-session[84931]: pam_unix(sshd:session): session closed for user zuul
Jan 21 17:55:24 compute-0 systemd[1]: session-20.scope: Deactivated successfully.
Jan 21 17:55:24 compute-0 systemd[1]: session-20.scope: Consumed 46.437s CPU time.
Jan 21 17:55:24 compute-0 systemd-logind[782]: Session 20 logged out. Waiting for processes to exit.
Jan 21 17:55:24 compute-0 systemd-logind[782]: Removed session 20.
Jan 21 17:55:26 compute-0 systemd[1]: Stopping User Manager for UID 0...
Jan 21 17:55:26 compute-0 systemd[95453]: Activating special unit Exit the Session...
Jan 21 17:55:26 compute-0 systemd[95453]: Stopped target Main User Target.
Jan 21 17:55:26 compute-0 systemd[95453]: Stopped target Basic System.
Jan 21 17:55:26 compute-0 systemd[95453]: Stopped target Paths.
Jan 21 17:55:26 compute-0 systemd[95453]: Stopped target Sockets.
Jan 21 17:55:26 compute-0 systemd[95453]: Stopped target Timers.
Jan 21 17:55:26 compute-0 systemd[95453]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 21 17:55:26 compute-0 systemd[95453]: Closed D-Bus User Message Bus Socket.
Jan 21 17:55:26 compute-0 systemd[95453]: Stopped Create User's Volatile Files and Directories.
Jan 21 17:55:26 compute-0 systemd[95453]: Removed slice User Application Slice.
Jan 21 17:55:26 compute-0 systemd[95453]: Reached target Shutdown.
Jan 21 17:55:26 compute-0 systemd[95453]: Finished Exit the Session.
Jan 21 17:55:26 compute-0 systemd[95453]: Reached target Exit the Session.
Jan 21 17:55:26 compute-0 systemd[1]: user@0.service: Deactivated successfully.
Jan 21 17:55:26 compute-0 systemd[1]: Stopped User Manager for UID 0.
Jan 21 17:55:26 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/0...
Jan 21 17:55:26 compute-0 systemd[1]: run-user-0.mount: Deactivated successfully.
Jan 21 17:55:26 compute-0 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Jan 21 17:55:26 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/0.
Jan 21 17:55:26 compute-0 systemd[1]: Removed slice User Slice of UID 0.
Jan 21 17:55:27 compute-0 ovn_controller[95419]: 2026-01-21T17:55:27Z|00025|memory|INFO|16256 kB peak resident set size after 10.4 seconds
Jan 21 17:55:27 compute-0 ovn_controller[95419]: 2026-01-21T17:55:27Z|00026|memory|INFO|idl-cells-OVN_Southbound:256 idl-cells-Open_vSwitch:528 ofctrl_desired_flow_usage-KB:6 ofctrl_installed_flow_usage-KB:5 ofctrl_sb_flow_ref_usage-KB:2
Jan 21 17:55:30 compute-0 sshd-session[96449]: Accepted publickey for zuul from 192.168.122.30 port 42146 ssh2: ECDSA SHA256:cKen23vhgWniT1Xii+Y+iYDmAKCwydOnjbSSWnKmVRE
Jan 21 17:55:30 compute-0 systemd-logind[782]: New session 22 of user zuul.
Jan 21 17:55:30 compute-0 systemd[1]: Started Session 22 of User zuul.
Jan 21 17:55:30 compute-0 sshd-session[96449]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 21 17:55:31 compute-0 python3.9[96602]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 17:55:32 compute-0 sudo[96756]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tmthdvumnitfirvuwqfkpfaamuiogafw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018131.8220155-43-276189439852995/AnsiballZ_file.py'
Jan 21 17:55:32 compute-0 sudo[96756]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:55:32 compute-0 python3.9[96758]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/openstack/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 21 17:55:32 compute-0 sudo[96756]: pam_unix(sudo:session): session closed for user root
Jan 21 17:55:32 compute-0 sudo[96908]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jheqapqdyizcmrikfwtlvaentgrnpzpd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018132.5794497-43-173179736292025/AnsiballZ_file.py'
Jan 21 17:55:32 compute-0 sudo[96908]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:55:33 compute-0 python3.9[96910]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 17:55:33 compute-0 sudo[96908]: pam_unix(sudo:session): session closed for user root
Jan 21 17:55:33 compute-0 sudo[97060]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwwatklcubftorfezrfnarfhbrkcrvby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018133.2228873-43-264927462376626/AnsiballZ_file.py'
Jan 21 17:55:33 compute-0 sudo[97060]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:55:33 compute-0 python3.9[97062]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 17:55:33 compute-0 sudo[97060]: pam_unix(sudo:session): session closed for user root
Jan 21 17:55:34 compute-0 sudo[97212]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mvuevcsvexnewcgpitguhmqfqwuygnxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018133.8497388-43-185797684534940/AnsiballZ_file.py'
Jan 21 17:55:34 compute-0 sudo[97212]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:55:34 compute-0 python3.9[97214]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 17:55:34 compute-0 sudo[97212]: pam_unix(sudo:session): session closed for user root
Jan 21 17:55:34 compute-0 sudo[97365]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qlukbdldwnpuqgbvxhqbgcxjdyhwtieq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018134.629516-43-158525942322542/AnsiballZ_file.py'
Jan 21 17:55:34 compute-0 sudo[97365]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:55:35 compute-0 python3.9[97367]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 17:55:35 compute-0 sudo[97365]: pam_unix(sudo:session): session closed for user root
Jan 21 17:55:36 compute-0 python3.9[97517]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 17:55:36 compute-0 sudo[97667]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wcxjagziyehylfwancsabzslcpjgmpzh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018136.2951689-131-51151247027274/AnsiballZ_seboolean.py'
Jan 21 17:55:36 compute-0 sudo[97667]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:55:36 compute-0 python3.9[97669]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Jan 21 17:55:37 compute-0 sudo[97667]: pam_unix(sudo:session): session closed for user root
Jan 21 17:55:38 compute-0 python3.9[97819]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:55:38 compute-0 python3.9[97940]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769018137.7801304-147-2225473637325/.source follow=False _original_basename=haproxy.j2 checksum=a5072e7b19ca96a1f495d94f97f31903737cfd27 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 17:55:39 compute-0 python3.9[98090]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:55:40 compute-0 python3.9[98211]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769018139.2937331-177-43023467917230/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 17:55:41 compute-0 sudo[98361]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-swbcvgkxezdwajuptjvyhtnvgwbnkpui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018140.7276707-211-212815755623326/AnsiballZ_setup.py'
Jan 21 17:55:41 compute-0 sudo[98361]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:55:41 compute-0 python3.9[98363]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 21 17:55:41 compute-0 sudo[98361]: pam_unix(sudo:session): session closed for user root
Jan 21 17:55:42 compute-0 sudo[98445]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmtcrlpilwgwocvmaziuxyowwuvxyelp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018140.7276707-211-212815755623326/AnsiballZ_dnf.py'
Jan 21 17:55:42 compute-0 sudo[98445]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:55:42 compute-0 python3.9[98447]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 21 17:55:43 compute-0 sudo[98445]: pam_unix(sudo:session): session closed for user root
Jan 21 17:55:44 compute-0 sudo[98598]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmtsavizxizcailluupyhloxwxbicjge ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018143.7730308-235-197939242623549/AnsiballZ_systemd.py'
Jan 21 17:55:44 compute-0 sudo[98598]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:55:44 compute-0 python3.9[98600]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 21 17:55:44 compute-0 sudo[98598]: pam_unix(sudo:session): session closed for user root
Jan 21 17:55:45 compute-0 python3.9[98753]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:55:45 compute-0 python3.9[98874]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769018144.9988105-251-217940322665517/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 17:55:46 compute-0 python3.9[99024]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:55:46 compute-0 podman[99119]: 2026-01-21 17:55:46.817111467 +0000 UTC m=+0.094978245 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, managed_by=edpm_ansible)
Jan 21 17:55:46 compute-0 python3.9[99156]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769018146.0227213-251-276598751855558/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 17:55:48 compute-0 python3.9[99322]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:55:48 compute-0 python3.9[99443]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769018147.9242568-339-53523706674740/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 17:55:49 compute-0 python3.9[99593]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:55:50 compute-0 python3.9[99714]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769018149.0610943-339-238623133685281/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 17:55:51 compute-0 python3.9[99864]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 17:55:51 compute-0 sshd-session[99891]: Invalid user ansible_user from 64.227.98.100 port 57794
Jan 21 17:55:51 compute-0 sshd-session[99891]: Connection closed by invalid user ansible_user 64.227.98.100 port 57794 [preauth]
Jan 21 17:55:51 compute-0 sudo[100018]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dwssespuwwxhkqofiervvdedkytqfecr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018151.3041372-415-172396417965008/AnsiballZ_file.py'
Jan 21 17:55:51 compute-0 sudo[100018]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:55:51 compute-0 python3.9[100020]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 17:55:51 compute-0 sudo[100018]: pam_unix(sudo:session): session closed for user root
Jan 21 17:55:52 compute-0 sudo[100170]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-smxbbkjewkstbbyukrsecdpzgvdshnzi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018152.0365367-431-278715793036475/AnsiballZ_stat.py'
Jan 21 17:55:52 compute-0 sudo[100170]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:55:52 compute-0 python3.9[100172]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:55:52 compute-0 sudo[100170]: pam_unix(sudo:session): session closed for user root
Jan 21 17:55:52 compute-0 sudo[100248]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uybsajpbntkvehfubkgfudbtftotqgwr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018152.0365367-431-278715793036475/AnsiballZ_file.py'
Jan 21 17:55:52 compute-0 sudo[100248]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:55:52 compute-0 python3.9[100250]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 17:55:52 compute-0 sudo[100248]: pam_unix(sudo:session): session closed for user root
Jan 21 17:55:53 compute-0 sudo[100400]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lgkhuzvswgsyrvuwlynnoetusfdxdlpt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018153.0444956-431-40932800974423/AnsiballZ_stat.py'
Jan 21 17:55:53 compute-0 sudo[100400]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:55:53 compute-0 python3.9[100402]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:55:53 compute-0 sudo[100400]: pam_unix(sudo:session): session closed for user root
Jan 21 17:55:53 compute-0 sudo[100478]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rtokgoehtjtpobzkgmqupbobqulbgeiq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018153.0444956-431-40932800974423/AnsiballZ_file.py'
Jan 21 17:55:53 compute-0 sudo[100478]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:55:53 compute-0 python3.9[100480]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 17:55:53 compute-0 sudo[100478]: pam_unix(sudo:session): session closed for user root
Jan 21 17:55:54 compute-0 sudo[100630]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pcrzjdiuzpchvpthzmlmniumcxgdbobr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018154.1714919-477-40341599355449/AnsiballZ_file.py'
Jan 21 17:55:54 compute-0 sudo[100630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:55:54 compute-0 python3.9[100632]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:55:54 compute-0 sudo[100630]: pam_unix(sudo:session): session closed for user root
Jan 21 17:55:55 compute-0 sudo[100782]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-admfcrlcxpcruuoxhoindbzwceikgiuh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018154.8973608-493-270350717893712/AnsiballZ_stat.py'
Jan 21 17:55:55 compute-0 sudo[100782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:55:55 compute-0 python3.9[100784]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:55:55 compute-0 sudo[100782]: pam_unix(sudo:session): session closed for user root
Jan 21 17:55:55 compute-0 sudo[100860]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ntcmcsswpindefzxoryqbqpvgjcmfxos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018154.8973608-493-270350717893712/AnsiballZ_file.py'
Jan 21 17:55:55 compute-0 sudo[100860]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:55:55 compute-0 python3.9[100862]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:55:55 compute-0 sudo[100860]: pam_unix(sudo:session): session closed for user root
Jan 21 17:55:56 compute-0 sudo[101012]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mafpbxlbjiobyhegvjqnmqkoryypqxpv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018156.144093-517-198337089451945/AnsiballZ_stat.py'
Jan 21 17:55:56 compute-0 sudo[101012]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:55:56 compute-0 python3.9[101014]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:55:56 compute-0 sudo[101012]: pam_unix(sudo:session): session closed for user root
Jan 21 17:55:56 compute-0 sudo[101090]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jgcwpcplmbzznhdwoustqsgrmvlkjgjk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018156.144093-517-198337089451945/AnsiballZ_file.py'
Jan 21 17:55:56 compute-0 sudo[101090]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:55:57 compute-0 python3.9[101092]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:55:57 compute-0 sudo[101090]: pam_unix(sudo:session): session closed for user root
Jan 21 17:55:57 compute-0 sudo[101242]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pvdsbjikkpierfvuxsllqqtgcootkris ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018157.2057858-541-211570820623801/AnsiballZ_systemd.py'
Jan 21 17:55:57 compute-0 sudo[101242]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:55:57 compute-0 python3.9[101244]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 17:55:57 compute-0 systemd[1]: Reloading.
Jan 21 17:55:57 compute-0 systemd-rc-local-generator[101271]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 17:55:57 compute-0 systemd-sysv-generator[101274]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 17:55:58 compute-0 sudo[101242]: pam_unix(sudo:session): session closed for user root
Jan 21 17:55:58 compute-0 sudo[101430]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xrqmlsaaprklbuxkqodaistrsaqdvezx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018158.3448844-557-166533140022240/AnsiballZ_stat.py'
Jan 21 17:55:58 compute-0 sudo[101430]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:55:58 compute-0 python3.9[101432]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:55:58 compute-0 sudo[101430]: pam_unix(sudo:session): session closed for user root
Jan 21 17:55:59 compute-0 sudo[101508]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qutzokspovkzrcwrbzcduerbrzvmyfly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018158.3448844-557-166533140022240/AnsiballZ_file.py'
Jan 21 17:55:59 compute-0 sudo[101508]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:55:59 compute-0 python3.9[101510]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:55:59 compute-0 sudo[101508]: pam_unix(sudo:session): session closed for user root
Jan 21 17:55:59 compute-0 sudo[101660]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-orwbdhydujdlyaljmpltvixmjrgvjcav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018159.4593213-581-256818298327191/AnsiballZ_stat.py'
Jan 21 17:55:59 compute-0 sudo[101660]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:55:59 compute-0 python3.9[101662]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:55:59 compute-0 sudo[101660]: pam_unix(sudo:session): session closed for user root
Jan 21 17:56:00 compute-0 sudo[101738]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rzvcezriqxrmgluhxzcijrqovrzktyyb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018159.4593213-581-256818298327191/AnsiballZ_file.py'
Jan 21 17:56:00 compute-0 sudo[101738]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:56:00 compute-0 python3.9[101740]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:56:00 compute-0 sudo[101738]: pam_unix(sudo:session): session closed for user root
Jan 21 17:56:01 compute-0 sudo[101890]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lngshyoxblytgchzqhtxsasrhmzmggvs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018160.7864869-605-40785516242823/AnsiballZ_systemd.py'
Jan 21 17:56:01 compute-0 sudo[101890]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:56:01 compute-0 python3.9[101892]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 17:56:01 compute-0 systemd[1]: Reloading.
Jan 21 17:56:01 compute-0 systemd-rc-local-generator[101918]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 17:56:01 compute-0 systemd-sysv-generator[101922]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 17:56:01 compute-0 systemd[1]: Starting Create netns directory...
Jan 21 17:56:01 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 21 17:56:01 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 21 17:56:01 compute-0 systemd[1]: Finished Create netns directory.
Jan 21 17:56:01 compute-0 sudo[101890]: pam_unix(sudo:session): session closed for user root
Jan 21 17:56:02 compute-0 sudo[102085]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qkpbdryfrsjizndcmhvpbaqgazorepun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018162.3100939-625-244401329286436/AnsiballZ_file.py'
Jan 21 17:56:02 compute-0 sudo[102085]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:56:02 compute-0 python3.9[102087]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 17:56:02 compute-0 sudo[102085]: pam_unix(sudo:session): session closed for user root
Jan 21 17:56:03 compute-0 sudo[102237]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yeaywpjfyxqtfnmlyfzqezvxwwlbcxxj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018163.0445955-641-126155587971498/AnsiballZ_stat.py'
Jan 21 17:56:03 compute-0 sudo[102237]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:56:03 compute-0 python3.9[102239]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:56:03 compute-0 sudo[102237]: pam_unix(sudo:session): session closed for user root
Jan 21 17:56:04 compute-0 sudo[102360]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-axashjmbhwahhthnkilfxipivkmxmdli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018163.0445955-641-126155587971498/AnsiballZ_copy.py'
Jan 21 17:56:04 compute-0 sudo[102360]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:56:04 compute-0 python3.9[102362]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769018163.0445955-641-126155587971498/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 21 17:56:04 compute-0 sudo[102360]: pam_unix(sudo:session): session closed for user root
Jan 21 17:56:04 compute-0 sudo[102512]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-esefqtmsvmvgykionkodmwmtvigyhppp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018164.7402637-675-82129962418096/AnsiballZ_file.py'
Jan 21 17:56:04 compute-0 sudo[102512]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:56:05 compute-0 python3.9[102514]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:56:05 compute-0 sudo[102512]: pam_unix(sudo:session): session closed for user root
Jan 21 17:56:05 compute-0 sudo[102664]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-trqwvpeuwknykcqvrxjpepfrrctjyhfe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018165.4671497-691-121178537048166/AnsiballZ_file.py'
Jan 21 17:56:05 compute-0 sudo[102664]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:56:05 compute-0 python3.9[102666]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 17:56:05 compute-0 sudo[102664]: pam_unix(sudo:session): session closed for user root
Jan 21 17:56:06 compute-0 sudo[102816]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gqwtoxjsfgcawggsqajarboykmbczvtn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018166.1310146-707-91666337891072/AnsiballZ_stat.py'
Jan 21 17:56:06 compute-0 sudo[102816]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:56:06 compute-0 python3.9[102818]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:56:06 compute-0 sudo[102816]: pam_unix(sudo:session): session closed for user root
Jan 21 17:56:07 compute-0 sudo[102939]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nidmuzxkwtolqmbftuhksiwtrryofxla ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018166.1310146-707-91666337891072/AnsiballZ_copy.py'
Jan 21 17:56:07 compute-0 sudo[102939]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:56:07 compute-0 python3.9[102941]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769018166.1310146-707-91666337891072/.source.json _original_basename=.5xkn7j9z follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:56:07 compute-0 sudo[102939]: pam_unix(sudo:session): session closed for user root
Jan 21 17:56:08 compute-0 python3.9[103091]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:56:11 compute-0 sudo[103512]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tbwnpbxbxyyplferocycytlqrtwjahja ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018170.6233308-787-49939578657448/AnsiballZ_container_config_data.py'
Jan 21 17:56:11 compute-0 sudo[103512]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:56:11 compute-0 python3.9[103514]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Jan 21 17:56:11 compute-0 sudo[103512]: pam_unix(sudo:session): session closed for user root
Jan 21 17:56:12 compute-0 sudo[103664]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zvbajxhrdflgigbmafqrfiqutaawsczb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018171.695396-809-29395497860005/AnsiballZ_container_config_hash.py'
Jan 21 17:56:12 compute-0 sudo[103664]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:56:12 compute-0 python3.9[103666]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 21 17:56:12 compute-0 sudo[103664]: pam_unix(sudo:session): session closed for user root
Jan 21 17:56:13 compute-0 sudo[103816]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwbekstklzxoskpcdcnoukmjfkothvji ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769018172.6802962-829-198776599780872/AnsiballZ_edpm_container_manage.py'
Jan 21 17:56:13 compute-0 sudo[103816]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:56:13 compute-0 python3[103818]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json containers=['ovn_metadata_agent'] log_base_path=/var/log/containers/stdouts debug=False
Jan 21 17:56:13 compute-0 podman[103854]: 2026-01-21 17:56:13.573797595 +0000 UTC m=+0.051177887 container create db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible)
Jan 21 17:56:13 compute-0 podman[103854]: 2026-01-21 17:56:13.544844375 +0000 UTC m=+0.022224877 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 17:56:13 compute-0 python3[103818]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 17:56:13 compute-0 sudo[103816]: pam_unix(sudo:session): session closed for user root
Jan 21 17:56:14 compute-0 sudo[104042]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xpkvovodgfcsmtoslhzctjzwygwrgbzo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018173.9780118-845-63905971271894/AnsiballZ_stat.py'
Jan 21 17:56:14 compute-0 sudo[104042]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:56:14 compute-0 python3.9[104044]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 17:56:14 compute-0 sudo[104042]: pam_unix(sudo:session): session closed for user root
Jan 21 17:56:15 compute-0 sudo[104196]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hwwkravzedcoynhdedrrrommebsycihy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018174.774945-863-47187911350287/AnsiballZ_file.py'
Jan 21 17:56:15 compute-0 sudo[104196]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:56:15 compute-0 python3.9[104198]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:56:15 compute-0 sudo[104196]: pam_unix(sudo:session): session closed for user root
Jan 21 17:56:15 compute-0 sudo[104272]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tgabnbwakfutqczmobsujvhkidatrsda ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018174.774945-863-47187911350287/AnsiballZ_stat.py'
Jan 21 17:56:15 compute-0 sudo[104272]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:56:15 compute-0 python3.9[104274]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 17:56:15 compute-0 sudo[104272]: pam_unix(sudo:session): session closed for user root
Jan 21 17:56:16 compute-0 sudo[104423]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cwvbuylioaofitdxzloitugasmbuooil ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018175.7863128-863-180209500863762/AnsiballZ_copy.py'
Jan 21 17:56:16 compute-0 sudo[104423]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:56:16 compute-0 python3.9[104425]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769018175.7863128-863-180209500863762/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:56:16 compute-0 sudo[104423]: pam_unix(sudo:session): session closed for user root
Jan 21 17:56:16 compute-0 sudo[104499]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nhrvzsplksliykoczpwxmqtbfdnikggn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018175.7863128-863-180209500863762/AnsiballZ_systemd.py'
Jan 21 17:56:16 compute-0 sudo[104499]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:56:16 compute-0 python3.9[104501]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 21 17:56:16 compute-0 systemd[1]: Reloading.
Jan 21 17:56:17 compute-0 systemd-rc-local-generator[104545]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 17:56:17 compute-0 systemd-sysv-generator[104548]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 17:56:17 compute-0 podman[104502]: 2026-01-21 17:56:17.04390298 +0000 UTC m=+0.089289665 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team)
Jan 21 17:56:17 compute-0 sudo[104499]: pam_unix(sudo:session): session closed for user root
Jan 21 17:56:17 compute-0 sudo[104634]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-actfmtfalelokjxhzvmojmdlraxcuufc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018175.7863128-863-180209500863762/AnsiballZ_systemd.py'
Jan 21 17:56:17 compute-0 sudo[104634]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:56:17 compute-0 python3.9[104636]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 17:56:17 compute-0 systemd[1]: Reloading.
Jan 21 17:56:17 compute-0 systemd-rc-local-generator[104667]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 17:56:17 compute-0 systemd-sysv-generator[104672]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 17:56:17 compute-0 systemd[1]: Starting ovn_metadata_agent container...
Jan 21 17:56:18 compute-0 systemd[1]: Started libcrun container.
Jan 21 17:56:18 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4a612acc63260f882e484b9bcab2310d31e3b6f45752b867d6f20a25e26bcdbb/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Jan 21 17:56:18 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4a612acc63260f882e484b9bcab2310d31e3b6f45752b867d6f20a25e26bcdbb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 17:56:18 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456.
Jan 21 17:56:18 compute-0 podman[104677]: 2026-01-21 17:56:18.095650134 +0000 UTC m=+0.111821068 container init db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 21 17:56:18 compute-0 ovn_metadata_agent[104693]: + sudo -E kolla_set_configs
Jan 21 17:56:18 compute-0 podman[104677]: 2026-01-21 17:56:18.125785361 +0000 UTC m=+0.141956275 container start db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 17:56:18 compute-0 edpm-start-podman-container[104677]: ovn_metadata_agent
Jan 21 17:56:18 compute-0 ovn_metadata_agent[104693]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 21 17:56:18 compute-0 ovn_metadata_agent[104693]: INFO:__main__:Validating config file
Jan 21 17:56:18 compute-0 ovn_metadata_agent[104693]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 21 17:56:18 compute-0 ovn_metadata_agent[104693]: INFO:__main__:Copying service configuration files
Jan 21 17:56:18 compute-0 ovn_metadata_agent[104693]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Jan 21 17:56:18 compute-0 ovn_metadata_agent[104693]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Jan 21 17:56:18 compute-0 ovn_metadata_agent[104693]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Jan 21 17:56:18 compute-0 ovn_metadata_agent[104693]: INFO:__main__:Writing out command to execute
Jan 21 17:56:18 compute-0 ovn_metadata_agent[104693]: INFO:__main__:Setting permission for /var/lib/neutron
Jan 21 17:56:18 compute-0 ovn_metadata_agent[104693]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Jan 21 17:56:18 compute-0 ovn_metadata_agent[104693]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Jan 21 17:56:18 compute-0 ovn_metadata_agent[104693]: INFO:__main__:Setting permission for /var/lib/neutron/external
Jan 21 17:56:18 compute-0 ovn_metadata_agent[104693]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Jan 21 17:56:18 compute-0 ovn_metadata_agent[104693]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Jan 21 17:56:18 compute-0 ovn_metadata_agent[104693]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Jan 21 17:56:18 compute-0 edpm-start-podman-container[104676]: Creating additional drop-in dependency for "ovn_metadata_agent" (db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456)
Jan 21 17:56:18 compute-0 ovn_metadata_agent[104693]: ++ cat /run_command
Jan 21 17:56:18 compute-0 ovn_metadata_agent[104693]: + CMD=neutron-ovn-metadata-agent
Jan 21 17:56:18 compute-0 ovn_metadata_agent[104693]: + ARGS=
Jan 21 17:56:18 compute-0 ovn_metadata_agent[104693]: + sudo kolla_copy_cacerts
Jan 21 17:56:18 compute-0 podman[104700]: 2026-01-21 17:56:18.194359413 +0000 UTC m=+0.055701270 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Jan 21 17:56:18 compute-0 systemd[1]: Reloading.
Jan 21 17:56:18 compute-0 ovn_metadata_agent[104693]: + [[ ! -n '' ]]
Jan 21 17:56:18 compute-0 ovn_metadata_agent[104693]: + . kolla_extend_start
Jan 21 17:56:18 compute-0 ovn_metadata_agent[104693]: Running command: 'neutron-ovn-metadata-agent'
Jan 21 17:56:18 compute-0 ovn_metadata_agent[104693]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Jan 21 17:56:18 compute-0 ovn_metadata_agent[104693]: + umask 0022
Jan 21 17:56:18 compute-0 ovn_metadata_agent[104693]: + exec neutron-ovn-metadata-agent
Jan 21 17:56:18 compute-0 systemd-rc-local-generator[104765]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 17:56:18 compute-0 systemd-sysv-generator[104774]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 17:56:18 compute-0 systemd[1]: Started ovn_metadata_agent container.
Jan 21 17:56:18 compute-0 sudo[104634]: pam_unix(sudo:session): session closed for user root
Jan 21 17:56:19 compute-0 python3.9[104930]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.009 104698 INFO neutron.common.config [-] Logging enabled!
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.009 104698 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.009 104698 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.010 104698 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.010 104698 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.010 104698 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.010 104698 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.011 104698 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.011 104698 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.011 104698 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.011 104698 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.011 104698 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.011 104698 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.011 104698 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.012 104698 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.012 104698 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.012 104698 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.012 104698 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.012 104698 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.012 104698 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.012 104698 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.013 104698 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.013 104698 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.013 104698 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.013 104698 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.013 104698 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.013 104698 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.013 104698 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.014 104698 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.014 104698 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.014 104698 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.014 104698 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.014 104698 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.014 104698 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.014 104698 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.015 104698 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.015 104698 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.015 104698 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.015 104698 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.015 104698 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.015 104698 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.015 104698 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.015 104698 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.016 104698 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.016 104698 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.016 104698 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.016 104698 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.016 104698 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.016 104698 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.016 104698 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.016 104698 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.016 104698 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.016 104698 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.016 104698 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.017 104698 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.017 104698 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.017 104698 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.017 104698 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.017 104698 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.017 104698 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.017 104698 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.017 104698 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.017 104698 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.018 104698 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.018 104698 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.018 104698 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.018 104698 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.018 104698 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.018 104698 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.018 104698 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.018 104698 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.018 104698 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.019 104698 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.019 104698 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.019 104698 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.019 104698 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.019 104698 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.019 104698 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.019 104698 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.019 104698 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.019 104698 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.020 104698 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.020 104698 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.020 104698 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.020 104698 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.020 104698 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.020 104698 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.020 104698 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.020 104698 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.020 104698 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.021 104698 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.021 104698 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.021 104698 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.021 104698 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.021 104698 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.021 104698 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.021 104698 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.021 104698 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.021 104698 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.021 104698 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.022 104698 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.022 104698 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.022 104698 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.022 104698 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.022 104698 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.022 104698 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.022 104698 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.022 104698 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.023 104698 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.023 104698 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.023 104698 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.023 104698 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.023 104698 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.023 104698 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.023 104698 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.024 104698 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.024 104698 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.024 104698 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.024 104698 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.024 104698 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.024 104698 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.024 104698 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.025 104698 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.025 104698 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.025 104698 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.025 104698 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.025 104698 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.025 104698 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.026 104698 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.026 104698 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.026 104698 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.026 104698 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.026 104698 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.026 104698 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.026 104698 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.027 104698 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.027 104698 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.027 104698 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.027 104698 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.027 104698 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.027 104698 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.027 104698 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.027 104698 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.028 104698 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.028 104698 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.028 104698 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.028 104698 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.028 104698 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.028 104698 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.028 104698 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.028 104698 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.028 104698 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.029 104698 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.029 104698 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.029 104698 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.029 104698 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.029 104698 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.029 104698 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.029 104698 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.029 104698 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.029 104698 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.030 104698 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.030 104698 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.030 104698 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.030 104698 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.030 104698 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.030 104698 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.030 104698 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.030 104698 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.030 104698 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.031 104698 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.031 104698 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.031 104698 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.031 104698 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.031 104698 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.031 104698 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.031 104698 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.031 104698 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.031 104698 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.032 104698 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.032 104698 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.032 104698 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.032 104698 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.032 104698 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.032 104698 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.032 104698 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.032 104698 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.032 104698 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.033 104698 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.033 104698 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.033 104698 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.033 104698 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.033 104698 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.033 104698 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.033 104698 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.033 104698 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.033 104698 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.034 104698 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.034 104698 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.034 104698 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.034 104698 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.034 104698 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.034 104698 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.034 104698 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.034 104698 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.034 104698 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.035 104698 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.035 104698 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.035 104698 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.035 104698 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.035 104698 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.035 104698 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.035 104698 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.035 104698 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.035 104698 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.036 104698 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.036 104698 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.036 104698 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.036 104698 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.036 104698 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.036 104698 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.036 104698 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.036 104698 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.036 104698 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.037 104698 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.037 104698 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.037 104698 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.037 104698 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.037 104698 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.037 104698 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.037 104698 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.038 104698 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.038 104698 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.038 104698 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.038 104698 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.038 104698 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.038 104698 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.038 104698 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.039 104698 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.039 104698 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.039 104698 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.039 104698 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.039 104698 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.039 104698 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.040 104698 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.040 104698 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.040 104698 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.040 104698 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.040 104698 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.040 104698 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.040 104698 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.040 104698 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.041 104698 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.041 104698 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.041 104698 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.041 104698 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.041 104698 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.041 104698 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.041 104698 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.041 104698 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.042 104698 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.042 104698 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.042 104698 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.042 104698 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.042 104698 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.042 104698 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.042 104698 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.042 104698 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.042 104698 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.043 104698 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.043 104698 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.043 104698 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.043 104698 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.043 104698 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.043 104698 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.043 104698 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.043 104698 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.043 104698 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.044 104698 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.044 104698 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.044 104698 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.044 104698 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.044 104698 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.044 104698 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.044 104698 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.044 104698 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.044 104698 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.045 104698 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.045 104698 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.045 104698 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.045 104698 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.045 104698 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.045 104698 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.045 104698 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.045 104698 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.046 104698 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.046 104698 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.046 104698 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.046 104698 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.046 104698 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.046 104698 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.046 104698 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.046 104698 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.046 104698 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.056 104698 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.056 104698 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.057 104698 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.057 104698 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.057 104698 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.069 104698 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name a4db021d-a451-4e5f-8011-49af760bda68 (UUID: a4db021d-a451-4e5f-8011-49af760bda68) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.096 104698 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.097 104698 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.097 104698 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.097 104698 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.102 104698 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.108 104698 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.114 104698 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', 'a4db021d-a451-4e5f-8011-49af760bda68'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>], external_ids={}, name=a4db021d-a451-4e5f-8011-49af760bda68, nb_cfg_timestamp=1769018124770, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.115 104698 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7f45e8ecab50>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.115 104698 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.116 104698 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.116 104698 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.116 104698 INFO oslo_service.service [-] Starting 1 workers
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.120 104698 DEBUG oslo_service.service [-] Started child 104955 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.123 104698 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmps_ek2wrl/privsep.sock']
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.123 104955 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-2094749'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.146 104955 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.147 104955 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.147 104955 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.151 104955 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.160 104955 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.168 104955 INFO eventlet.wsgi.server [-] (104955) wsgi starting up on http:/var/lib/neutron/metadata_proxy
Jan 21 17:56:20 compute-0 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Jan 21 17:56:20 compute-0 sudo[105086]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vnljzwmxkefwsxmmqrabtwezssxditjs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018180.468371-953-72948425937844/AnsiballZ_stat.py'
Jan 21 17:56:20 compute-0 sudo[105086]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.817 104698 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.817 104698 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmps_ek2wrl/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.688 105036 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.695 105036 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.699 105036 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.700 105036 INFO oslo.privsep.daemon [-] privsep daemon running as pid 105036
Jan 21 17:56:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:20.820 105036 DEBUG oslo.privsep.daemon [-] privsep: reply[8b66d3f1-e4c9-4acf-b4e0-1c7bced33aa3]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 17:56:20 compute-0 python3.9[105088]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 17:56:21 compute-0 sudo[105086]: pam_unix(sudo:session): session closed for user root
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.376 105036 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.376 105036 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.376 105036 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 17:56:21 compute-0 sudo[105215]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-exxklviugosjmbxoqckjcdnolqekilzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018180.468371-953-72948425937844/AnsiballZ_copy.py'
Jan 21 17:56:21 compute-0 sudo[105215]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:56:21 compute-0 python3.9[105217]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769018180.468371-953-72948425937844/.source.yaml _original_basename=.7lll4i6i follow=False checksum=84b676b4ec2cae5a9ddf6c6064ff2532f9175535 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:56:21 compute-0 sudo[105215]: pam_unix(sudo:session): session closed for user root
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.913 105036 DEBUG oslo.privsep.daemon [-] privsep: reply[09ba56d7-1724-4d98-9d16-bfe285a6def3]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.915 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=a4db021d-a451-4e5f-8011-49af760bda68, column=external_ids, values=({'neutron:ovn-metadata-id': 'b4634a2a-fe06-5319-af16-12e58bfe2bb1'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.935 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a4db021d-a451-4e5f-8011-49af760bda68, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.940 104698 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.940 104698 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.941 104698 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.941 104698 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.941 104698 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.941 104698 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.941 104698 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.941 104698 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.941 104698 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.941 104698 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.942 104698 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.942 104698 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.942 104698 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.942 104698 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.942 104698 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.942 104698 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.942 104698 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.942 104698 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.942 104698 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.943 104698 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.943 104698 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.943 104698 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.943 104698 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.943 104698 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.943 104698 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.943 104698 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.943 104698 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.943 104698 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.944 104698 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.944 104698 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.944 104698 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.944 104698 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.944 104698 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.944 104698 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.944 104698 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.944 104698 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.945 104698 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.945 104698 DEBUG oslo_service.service [-] host                           = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.945 104698 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.945 104698 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.945 104698 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.945 104698 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.945 104698 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.945 104698 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.946 104698 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.946 104698 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.946 104698 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.946 104698 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.946 104698 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.946 104698 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.946 104698 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.946 104698 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.946 104698 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.946 104698 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.947 104698 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.947 104698 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.947 104698 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.947 104698 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.947 104698 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.947 104698 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.947 104698 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.947 104698 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.947 104698 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.947 104698 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.948 104698 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.948 104698 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.948 104698 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.948 104698 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.948 104698 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.948 104698 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.948 104698 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.948 104698 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.948 104698 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.949 104698 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.949 104698 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.949 104698 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.949 104698 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.949 104698 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.949 104698 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.949 104698 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.949 104698 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.949 104698 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.950 104698 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.950 104698 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.950 104698 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.950 104698 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.950 104698 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.950 104698 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.950 104698 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.950 104698 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.950 104698 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.951 104698 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.951 104698 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.951 104698 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.951 104698 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.951 104698 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.951 104698 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.951 104698 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.951 104698 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.951 104698 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.951 104698 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.951 104698 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.952 104698 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.952 104698 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.952 104698 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.952 104698 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.952 104698 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.952 104698 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.952 104698 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.952 104698 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.952 104698 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.953 104698 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.953 104698 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.953 104698 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.953 104698 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.953 104698 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.953 104698 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.953 104698 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.953 104698 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.954 104698 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.954 104698 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.954 104698 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.954 104698 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.954 104698 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.954 104698 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.954 104698 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.954 104698 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.954 104698 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.955 104698 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.955 104698 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.955 104698 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.955 104698 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.955 104698 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.955 104698 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.955 104698 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.955 104698 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.955 104698 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.956 104698 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.956 104698 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.956 104698 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.956 104698 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.956 104698 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.956 104698 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.956 104698 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.956 104698 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.956 104698 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.956 104698 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.956 104698 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.957 104698 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.957 104698 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.957 104698 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.957 104698 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.957 104698 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.957 104698 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.957 104698 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.957 104698 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.957 104698 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.958 104698 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.958 104698 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.958 104698 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.958 104698 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.958 104698 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.958 104698 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.958 104698 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.958 104698 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.958 104698 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.958 104698 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.959 104698 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.959 104698 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.959 104698 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.959 104698 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.959 104698 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.959 104698 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.959 104698 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.959 104698 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.959 104698 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.959 104698 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.960 104698 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.960 104698 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.960 104698 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.960 104698 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.960 104698 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.960 104698 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.960 104698 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.960 104698 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.960 104698 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.961 104698 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.961 104698 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.961 104698 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.961 104698 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.961 104698 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.961 104698 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.961 104698 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.961 104698 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.961 104698 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.961 104698 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.962 104698 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.962 104698 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.962 104698 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.962 104698 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.962 104698 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.962 104698 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.962 104698 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.962 104698 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.962 104698 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.962 104698 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.963 104698 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.963 104698 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.963 104698 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.963 104698 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.963 104698 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.963 104698 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.963 104698 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.963 104698 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.963 104698 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.963 104698 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.964 104698 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.964 104698 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.964 104698 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.964 104698 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.964 104698 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.964 104698 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.964 104698 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.964 104698 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.965 104698 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.965 104698 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.965 104698 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.965 104698 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.965 104698 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.965 104698 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.965 104698 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.965 104698 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.965 104698 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.966 104698 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.966 104698 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.966 104698 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.966 104698 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.966 104698 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.966 104698 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.966 104698 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.966 104698 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.966 104698 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.967 104698 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.967 104698 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.967 104698 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.967 104698 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.967 104698 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.967 104698 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.967 104698 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.967 104698 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.967 104698 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.968 104698 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.968 104698 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.968 104698 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.968 104698 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.968 104698 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.968 104698 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.968 104698 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.968 104698 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.968 104698 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.969 104698 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.969 104698 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.969 104698 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.969 104698 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.969 104698 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.969 104698 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.969 104698 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.969 104698 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.969 104698 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.970 104698 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.970 104698 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.970 104698 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.970 104698 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.970 104698 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.970 104698 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.970 104698 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.970 104698 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.971 104698 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.971 104698 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.971 104698 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.971 104698 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.971 104698 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.971 104698 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.971 104698 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.972 104698 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.972 104698 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 sshd-session[96452]: Connection closed by 192.168.122.30 port 42146
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.972 104698 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.972 104698 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.972 104698 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.972 104698 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.972 104698 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.972 104698 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.972 104698 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.973 104698 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.973 104698 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.973 104698 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.973 104698 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.973 104698 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.973 104698 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.973 104698 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.973 104698 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.973 104698 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.974 104698 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 17:56:21 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:56:21.974 104698 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Jan 21 17:56:21 compute-0 sshd-session[96449]: pam_unix(sshd:session): session closed for user zuul
Jan 21 17:56:21 compute-0 systemd[1]: session-22.scope: Deactivated successfully.
Jan 21 17:56:21 compute-0 systemd[1]: session-22.scope: Consumed 34.325s CPU time.
Jan 21 17:56:21 compute-0 systemd-logind[782]: Session 22 logged out. Waiting for processes to exit.
Jan 21 17:56:21 compute-0 systemd-logind[782]: Removed session 22.
Jan 21 17:56:26 compute-0 sshd-session[105242]: Accepted publickey for zuul from 192.168.122.30 port 49994 ssh2: ECDSA SHA256:cKen23vhgWniT1Xii+Y+iYDmAKCwydOnjbSSWnKmVRE
Jan 21 17:56:26 compute-0 systemd-logind[782]: New session 23 of user zuul.
Jan 21 17:56:27 compute-0 systemd[1]: Started Session 23 of User zuul.
Jan 21 17:56:27 compute-0 sshd-session[105242]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 21 17:56:27 compute-0 python3.9[105395]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 17:56:32 compute-0 sudo[105549]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vbddazoaxwblmhtbtqfcmqwxutzgjqet ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018192.4210205-43-92045858465210/AnsiballZ_command.py'
Jan 21 17:56:32 compute-0 sudo[105549]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:56:33 compute-0 python3.9[105551]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 17:56:33 compute-0 sudo[105549]: pam_unix(sudo:session): session closed for user root
Jan 21 17:56:34 compute-0 sudo[105714]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-syexkzssyflacpspeczewzvsgsrrqmmn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018193.561263-65-42746868294798/AnsiballZ_systemd_service.py'
Jan 21 17:56:34 compute-0 sudo[105714]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:56:34 compute-0 python3.9[105716]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 21 17:56:34 compute-0 systemd[1]: Reloading.
Jan 21 17:56:34 compute-0 systemd-rc-local-generator[105740]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 17:56:34 compute-0 systemd-sysv-generator[105745]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 17:56:34 compute-0 sudo[105714]: pam_unix(sudo:session): session closed for user root
Jan 21 17:56:35 compute-0 python3.9[105902]: ansible-ansible.builtin.service_facts Invoked
Jan 21 17:56:35 compute-0 network[105919]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 21 17:56:35 compute-0 network[105920]: 'network-scripts' will be removed from distribution in near future.
Jan 21 17:56:35 compute-0 network[105921]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 21 17:56:39 compute-0 sudo[106180]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ieoorclihaoyyzvyptlxpfrvhmoftivh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018199.3785813-103-32824255387860/AnsiballZ_systemd_service.py'
Jan 21 17:56:39 compute-0 sudo[106180]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:56:39 compute-0 python3.9[106182]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 17:56:40 compute-0 sudo[106180]: pam_unix(sudo:session): session closed for user root
Jan 21 17:56:40 compute-0 sudo[106333]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sfoeptbcqksfuuiaglarhuopcfwoenyu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018200.1295645-103-155262631723308/AnsiballZ_systemd_service.py'
Jan 21 17:56:40 compute-0 sudo[106333]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:56:40 compute-0 python3.9[106335]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 17:56:40 compute-0 sudo[106333]: pam_unix(sudo:session): session closed for user root
Jan 21 17:56:41 compute-0 sudo[106486]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-daijdmwqtkhjjffhwpetgikrcemrfzzn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018200.839076-103-234125216572807/AnsiballZ_systemd_service.py'
Jan 21 17:56:41 compute-0 sudo[106486]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:56:41 compute-0 python3.9[106488]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 17:56:41 compute-0 sudo[106486]: pam_unix(sudo:session): session closed for user root
Jan 21 17:56:41 compute-0 sudo[106639]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mghcgkaqypjnafzmmgopjuwddzizwoeu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018201.5613008-103-107867525697912/AnsiballZ_systemd_service.py'
Jan 21 17:56:41 compute-0 sudo[106639]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:56:42 compute-0 python3.9[106641]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 17:56:42 compute-0 sudo[106639]: pam_unix(sudo:session): session closed for user root
Jan 21 17:56:42 compute-0 sudo[106792]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fytkethmyzarhylyhplwchlwrxhruqyt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018202.3375657-103-95736098325600/AnsiballZ_systemd_service.py'
Jan 21 17:56:42 compute-0 sudo[106792]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:56:42 compute-0 python3.9[106794]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 17:56:42 compute-0 sudo[106792]: pam_unix(sudo:session): session closed for user root
Jan 21 17:56:43 compute-0 sudo[106945]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-unqwtryjtfvpkgobcejfqqenkrcgpbsr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018203.0829978-103-103459476831229/AnsiballZ_systemd_service.py'
Jan 21 17:56:43 compute-0 sudo[106945]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:56:43 compute-0 python3.9[106947]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 17:56:43 compute-0 sudo[106945]: pam_unix(sudo:session): session closed for user root
Jan 21 17:56:44 compute-0 sudo[107098]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cssomxydydmzgovhauqmettzuoacabba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018203.846161-103-200170618452912/AnsiballZ_systemd_service.py'
Jan 21 17:56:44 compute-0 sudo[107098]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:56:44 compute-0 python3.9[107100]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 17:56:44 compute-0 sudo[107098]: pam_unix(sudo:session): session closed for user root
Jan 21 17:56:46 compute-0 sudo[107251]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-poyrwuyphkqokepgczmlmmlhtcybtruo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018206.061009-207-159351703498792/AnsiballZ_file.py'
Jan 21 17:56:46 compute-0 sudo[107251]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:56:46 compute-0 python3.9[107253]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:56:46 compute-0 sudo[107251]: pam_unix(sudo:session): session closed for user root
Jan 21 17:56:47 compute-0 sudo[107403]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-goqascptzcrlrvdwwcspddfvpryacpyi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018206.8462532-207-238724246948681/AnsiballZ_file.py'
Jan 21 17:56:47 compute-0 sudo[107403]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:56:47 compute-0 python3.9[107405]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:56:47 compute-0 sudo[107403]: pam_unix(sudo:session): session closed for user root
Jan 21 17:56:47 compute-0 sudo[107573]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bgfudxdkbibtddibvgsbdfdnlpcxlzxh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018207.5385144-207-56118080911388/AnsiballZ_file.py'
Jan 21 17:56:47 compute-0 sudo[107573]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:56:47 compute-0 podman[107529]: 2026-01-21 17:56:47.891076626 +0000 UTC m=+0.090803800 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 21 17:56:48 compute-0 python3.9[107578]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:56:48 compute-0 sudo[107573]: pam_unix(sudo:session): session closed for user root
Jan 21 17:56:48 compute-0 sudo[107741]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wrplxvwhovnspjtofedadbiafbqeyujd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018208.2055597-207-100025193784353/AnsiballZ_file.py'
Jan 21 17:56:48 compute-0 sudo[107741]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:56:48 compute-0 podman[107708]: 2026-01-21 17:56:48.48113869 +0000 UTC m=+0.065817960 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Jan 21 17:56:48 compute-0 python3.9[107746]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:56:48 compute-0 sudo[107741]: pam_unix(sudo:session): session closed for user root
Jan 21 17:56:49 compute-0 sudo[107903]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fbwrtuunccrmynjnblyrazeteuyprfie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018208.7773995-207-186288056484868/AnsiballZ_file.py'
Jan 21 17:56:49 compute-0 sudo[107903]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:56:49 compute-0 python3.9[107905]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:56:49 compute-0 sudo[107903]: pam_unix(sudo:session): session closed for user root
Jan 21 17:56:49 compute-0 sudo[108055]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zgcgyoobhykokeigysgdgejpjrzeuote ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018209.3691742-207-4566597156881/AnsiballZ_file.py'
Jan 21 17:56:49 compute-0 sudo[108055]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:56:49 compute-0 python3.9[108057]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:56:49 compute-0 sudo[108055]: pam_unix(sudo:session): session closed for user root
Jan 21 17:56:50 compute-0 sudo[108207]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-izeoryowzhmtaykxczxfsxvwjxqbyejj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018209.9473772-207-80414466467558/AnsiballZ_file.py'
Jan 21 17:56:50 compute-0 sudo[108207]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:56:50 compute-0 python3.9[108209]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:56:50 compute-0 sudo[108207]: pam_unix(sudo:session): session closed for user root
Jan 21 17:56:51 compute-0 sudo[108359]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iwbkoxppvkeayncmzuelbtsjxiasipde ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018211.4461706-307-228073193673712/AnsiballZ_file.py'
Jan 21 17:56:51 compute-0 sudo[108359]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:56:51 compute-0 python3.9[108361]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:56:51 compute-0 sudo[108359]: pam_unix(sudo:session): session closed for user root
Jan 21 17:56:52 compute-0 sudo[108511]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gdmkrorjilbsyvwidojzumorrtnpxapz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018211.97076-307-30448252891333/AnsiballZ_file.py'
Jan 21 17:56:52 compute-0 sudo[108511]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:56:52 compute-0 python3.9[108513]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:56:52 compute-0 sudo[108511]: pam_unix(sudo:session): session closed for user root
Jan 21 17:56:52 compute-0 sudo[108663]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sbjojfoxqmzxqasmovytuhvehuefxcmk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018212.5067697-307-148256135084503/AnsiballZ_file.py'
Jan 21 17:56:52 compute-0 sudo[108663]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:56:52 compute-0 python3.9[108665]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:56:52 compute-0 sudo[108663]: pam_unix(sudo:session): session closed for user root
Jan 21 17:56:53 compute-0 sudo[108815]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fmqwftatxkfrmfdhmqnsqfyxvtbcimtg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018213.0544913-307-243481831337044/AnsiballZ_file.py'
Jan 21 17:56:53 compute-0 sudo[108815]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:56:53 compute-0 python3.9[108817]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:56:53 compute-0 sudo[108815]: pam_unix(sudo:session): session closed for user root
Jan 21 17:56:53 compute-0 sudo[108967]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dbrupolezifmqksdodnlvkmtilcujdnx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018213.6428485-307-208385103319806/AnsiballZ_file.py'
Jan 21 17:56:53 compute-0 sudo[108967]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:56:54 compute-0 python3.9[108969]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:56:54 compute-0 sudo[108967]: pam_unix(sudo:session): session closed for user root
Jan 21 17:56:54 compute-0 sudo[109119]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gkzssypxeirgzlwlueshagakmzaxxsce ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018214.2734523-307-66392039649115/AnsiballZ_file.py'
Jan 21 17:56:54 compute-0 sudo[109119]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:56:54 compute-0 python3.9[109121]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:56:54 compute-0 sudo[109119]: pam_unix(sudo:session): session closed for user root
Jan 21 17:56:55 compute-0 sudo[109271]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rtxottklzkxolwqkadvkozgunygsthhg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018214.8591864-307-214813602162219/AnsiballZ_file.py'
Jan 21 17:56:55 compute-0 sudo[109271]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:56:55 compute-0 python3.9[109273]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:56:55 compute-0 sudo[109271]: pam_unix(sudo:session): session closed for user root
Jan 21 17:56:56 compute-0 sudo[109423]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ufjksxxxayfnqzydlafchininpwavigm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018216.4803972-409-106167736022790/AnsiballZ_command.py'
Jan 21 17:56:56 compute-0 sudo[109423]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:56:56 compute-0 python3.9[109425]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 17:56:56 compute-0 sudo[109423]: pam_unix(sudo:session): session closed for user root
Jan 21 17:56:57 compute-0 python3.9[109577]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 21 17:56:58 compute-0 sudo[109727]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ysmzupgicynplrvkkdvdcqipxbegkici ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018218.135646-445-172225257738635/AnsiballZ_systemd_service.py'
Jan 21 17:56:58 compute-0 sudo[109727]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:56:58 compute-0 python3.9[109729]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 21 17:56:58 compute-0 systemd[1]: Reloading.
Jan 21 17:56:58 compute-0 systemd-sysv-generator[109760]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 17:56:58 compute-0 systemd-rc-local-generator[109756]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 17:56:58 compute-0 sudo[109727]: pam_unix(sudo:session): session closed for user root
Jan 21 17:56:59 compute-0 sudo[109914]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dnvxhxzmmgoxpgblykcjumzhpbykulbu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018219.2349656-461-172561047952352/AnsiballZ_command.py'
Jan 21 17:56:59 compute-0 sudo[109914]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:56:59 compute-0 python3.9[109916]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 17:56:59 compute-0 sudo[109914]: pam_unix(sudo:session): session closed for user root
Jan 21 17:57:00 compute-0 sudo[110067]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bveybrwlsdwqavyrwcvesgnejfvhmyns ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018219.855154-461-85834306406210/AnsiballZ_command.py'
Jan 21 17:57:00 compute-0 sudo[110067]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:57:00 compute-0 python3.9[110069]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 17:57:00 compute-0 sudo[110067]: pam_unix(sudo:session): session closed for user root
Jan 21 17:57:01 compute-0 sudo[110220]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uneindtgiyyojbnxolbowqybohbobten ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018220.3966594-461-252696344758072/AnsiballZ_command.py'
Jan 21 17:57:01 compute-0 sudo[110220]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:57:01 compute-0 python3.9[110222]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 17:57:01 compute-0 sudo[110220]: pam_unix(sudo:session): session closed for user root
Jan 21 17:57:01 compute-0 sudo[110373]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jeyumiyoyupujndkdautatdvxetkvcsm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018221.657072-461-155903794398466/AnsiballZ_command.py'
Jan 21 17:57:01 compute-0 sudo[110373]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:57:02 compute-0 python3.9[110375]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 17:57:02 compute-0 sudo[110373]: pam_unix(sudo:session): session closed for user root
Jan 21 17:57:02 compute-0 sudo[110526]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cxzdgfrrzkcqqnkylzxfmfgmossrrmrr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018222.2788124-461-202525493035921/AnsiballZ_command.py'
Jan 21 17:57:02 compute-0 sudo[110526]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:57:02 compute-0 python3.9[110528]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 17:57:02 compute-0 sudo[110526]: pam_unix(sudo:session): session closed for user root
Jan 21 17:57:03 compute-0 sudo[110679]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ukfxprtgblnugygfhzulhvkhtfeiitsb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018222.873289-461-258004948954362/AnsiballZ_command.py'
Jan 21 17:57:03 compute-0 sudo[110679]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:57:03 compute-0 python3.9[110681]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 17:57:03 compute-0 sudo[110679]: pam_unix(sudo:session): session closed for user root
Jan 21 17:57:03 compute-0 sudo[110832]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ukmmyobjecyprnqpwndikmumolrrqzvh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018223.4408827-461-191331670788718/AnsiballZ_command.py'
Jan 21 17:57:03 compute-0 sudo[110832]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:57:03 compute-0 python3.9[110834]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 17:57:03 compute-0 sudo[110832]: pam_unix(sudo:session): session closed for user root
Jan 21 17:57:04 compute-0 sudo[110985]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hwfajqfardwrtkmkegtfwsfihhwbokkc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018224.3367147-569-242050127738153/AnsiballZ_getent.py'
Jan 21 17:57:04 compute-0 sudo[110985]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:57:04 compute-0 python3.9[110987]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Jan 21 17:57:04 compute-0 sudo[110985]: pam_unix(sudo:session): session closed for user root
Jan 21 17:57:05 compute-0 sudo[111138]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pwldsrylktrldabaveppbrsigpjgqygr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018225.162369-585-202908297856089/AnsiballZ_group.py'
Jan 21 17:57:05 compute-0 sudo[111138]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:57:05 compute-0 python3.9[111140]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 21 17:57:05 compute-0 groupadd[111141]: group added to /etc/group: name=libvirt, GID=42473
Jan 21 17:57:05 compute-0 groupadd[111141]: group added to /etc/gshadow: name=libvirt
Jan 21 17:57:05 compute-0 groupadd[111141]: new group: name=libvirt, GID=42473
Jan 21 17:57:05 compute-0 sudo[111138]: pam_unix(sudo:session): session closed for user root
Jan 21 17:57:06 compute-0 sudo[111296]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ehaosvnwrjtnssclnpqwemwrkbosqozn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018226.0887086-601-6096879480903/AnsiballZ_user.py'
Jan 21 17:57:06 compute-0 sudo[111296]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:57:06 compute-0 python3.9[111298]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 21 17:57:06 compute-0 useradd[111300]: new user: name=libvirt, UID=42473, GID=42473, home=/home/libvirt, shell=/sbin/nologin, from=/dev/pts/0
Jan 21 17:57:06 compute-0 sudo[111296]: pam_unix(sudo:session): session closed for user root
Jan 21 17:57:07 compute-0 sudo[111456]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uyrpeomuhvtunbtozuftwnblytrjcwdg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018227.354686-623-76170152269457/AnsiballZ_setup.py'
Jan 21 17:57:07 compute-0 sudo[111456]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:57:07 compute-0 python3.9[111458]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 21 17:57:08 compute-0 sudo[111456]: pam_unix(sudo:session): session closed for user root
Jan 21 17:57:08 compute-0 sudo[111540]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eykazvrlkgnmgrmtnrbytwelutnivbea ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018227.354686-623-76170152269457/AnsiballZ_dnf.py'
Jan 21 17:57:08 compute-0 sudo[111540]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 17:57:08 compute-0 python3.9[111542]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 21 17:57:18 compute-0 podman[111595]: 2026-01-21 17:57:18.223328679 +0000 UTC m=+0.098631777 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 21 17:57:18 compute-0 podman[111660]: 2026-01-21 17:57:18.997380707 +0000 UTC m=+0.054199271 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 21 17:57:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:57:20.049 104698 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 17:57:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:57:20.050 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 17:57:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:57:20.050 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 17:57:41 compute-0 kernel: SELinux:  Converting 2763 SID table entries...
Jan 21 17:57:41 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Jan 21 17:57:41 compute-0 kernel: SELinux:  policy capability open_perms=1
Jan 21 17:57:41 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Jan 21 17:57:41 compute-0 kernel: SELinux:  policy capability always_check_network=0
Jan 21 17:57:41 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 21 17:57:41 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 21 17:57:41 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 21 17:57:48 compute-0 dbus-broker-launch[769]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Jan 21 17:57:49 compute-0 podman[111786]: 2026-01-21 17:57:49.052473677 +0000 UTC m=+0.087340825 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 21 17:57:49 compute-0 podman[111812]: 2026-01-21 17:57:49.120374735 +0000 UTC m=+0.044242732 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible)
Jan 21 17:57:51 compute-0 kernel: SELinux:  Converting 2763 SID table entries...
Jan 21 17:57:51 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Jan 21 17:57:51 compute-0 kernel: SELinux:  policy capability open_perms=1
Jan 21 17:57:51 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Jan 21 17:57:51 compute-0 kernel: SELinux:  policy capability always_check_network=0
Jan 21 17:57:51 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 21 17:57:51 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 21 17:57:51 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 21 17:58:09 compute-0 sshd-session[114911]: Invalid user ansible from 64.227.98.100 port 54130
Jan 21 17:58:09 compute-0 sshd-session[114911]: Connection closed by invalid user ansible 64.227.98.100 port 54130 [preauth]
Jan 21 17:58:19 compute-0 dbus-broker-launch[769]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Jan 21 17:58:20 compute-0 podman[122046]: 2026-01-21 17:58:20.028360467 +0000 UTC m=+0.063612946 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 21 17:58:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:58:20.050 104698 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 17:58:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:58:20.051 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 17:58:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:58:20.051 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 17:58:20 compute-0 podman[122035]: 2026-01-21 17:58:20.062360183 +0000 UTC m=+0.097955800 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Jan 21 17:58:45 compute-0 kernel: SELinux:  Converting 2764 SID table entries...
Jan 21 17:58:45 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Jan 21 17:58:45 compute-0 kernel: SELinux:  policy capability open_perms=1
Jan 21 17:58:45 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Jan 21 17:58:45 compute-0 kernel: SELinux:  policy capability always_check_network=0
Jan 21 17:58:45 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 21 17:58:45 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 21 17:58:45 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 21 17:58:46 compute-0 groupadd[128774]: group added to /etc/group: name=dnsmasq, GID=993
Jan 21 17:58:46 compute-0 groupadd[128774]: group added to /etc/gshadow: name=dnsmasq
Jan 21 17:58:46 compute-0 groupadd[128774]: new group: name=dnsmasq, GID=993
Jan 21 17:58:46 compute-0 useradd[128781]: new user: name=dnsmasq, UID=992, GID=993, home=/var/lib/dnsmasq, shell=/usr/sbin/nologin, from=none
Jan 21 17:58:46 compute-0 dbus-broker-launch[755]: Noticed file-system modification, trigger reload.
Jan 21 17:58:46 compute-0 dbus-broker-launch[769]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Jan 21 17:58:46 compute-0 dbus-broker-launch[755]: Noticed file-system modification, trigger reload.
Jan 21 17:58:47 compute-0 groupadd[128794]: group added to /etc/group: name=clevis, GID=992
Jan 21 17:58:47 compute-0 groupadd[128794]: group added to /etc/gshadow: name=clevis
Jan 21 17:58:47 compute-0 groupadd[128794]: new group: name=clevis, GID=992
Jan 21 17:58:47 compute-0 useradd[128801]: new user: name=clevis, UID=991, GID=992, home=/var/cache/clevis, shell=/usr/sbin/nologin, from=none
Jan 21 17:58:47 compute-0 usermod[128811]: add 'clevis' to group 'tss'
Jan 21 17:58:47 compute-0 usermod[128811]: add 'clevis' to shadow group 'tss'
Jan 21 17:58:50 compute-0 podman[128840]: 2026-01-21 17:58:50.360653886 +0000 UTC m=+0.067227733 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 21 17:58:50 compute-0 polkitd[43645]: Reloading rules
Jan 21 17:58:50 compute-0 polkitd[43645]: Collecting garbage unconditionally...
Jan 21 17:58:50 compute-0 polkitd[43645]: Loading rules from directory /etc/polkit-1/rules.d
Jan 21 17:58:50 compute-0 polkitd[43645]: Loading rules from directory /usr/share/polkit-1/rules.d
Jan 21 17:58:50 compute-0 polkitd[43645]: Finished loading, compiling and executing 3 rules
Jan 21 17:58:50 compute-0 polkitd[43645]: Reloading rules
Jan 21 17:58:50 compute-0 polkitd[43645]: Collecting garbage unconditionally...
Jan 21 17:58:50 compute-0 polkitd[43645]: Loading rules from directory /etc/polkit-1/rules.d
Jan 21 17:58:50 compute-0 polkitd[43645]: Loading rules from directory /usr/share/polkit-1/rules.d
Jan 21 17:58:50 compute-0 polkitd[43645]: Finished loading, compiling and executing 3 rules
Jan 21 17:58:50 compute-0 podman[128839]: 2026-01-21 17:58:50.415549942 +0000 UTC m=+0.122557069 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller)
Jan 21 17:58:51 compute-0 groupadd[129046]: group added to /etc/group: name=ceph, GID=167
Jan 21 17:58:51 compute-0 groupadd[129046]: group added to /etc/gshadow: name=ceph
Jan 21 17:58:51 compute-0 groupadd[129046]: new group: name=ceph, GID=167
Jan 21 17:58:51 compute-0 useradd[129052]: new user: name=ceph, UID=167, GID=167, home=/var/lib/ceph, shell=/sbin/nologin, from=none
Jan 21 17:58:55 compute-0 systemd[1]: Stopping OpenSSH server daemon...
Jan 21 17:58:55 compute-0 sshd[1003]: Received signal 15; terminating.
Jan 21 17:58:55 compute-0 systemd[1]: sshd.service: Deactivated successfully.
Jan 21 17:58:55 compute-0 systemd[1]: Stopped OpenSSH server daemon.
Jan 21 17:58:55 compute-0 systemd[1]: sshd.service: Consumed 1.950s CPU time, read 564.0K from disk, written 40.0K to disk.
Jan 21 17:58:55 compute-0 systemd[1]: Stopped target sshd-keygen.target.
Jan 21 17:58:55 compute-0 systemd[1]: Stopping sshd-keygen.target...
Jan 21 17:58:55 compute-0 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 21 17:58:55 compute-0 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 21 17:58:55 compute-0 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 21 17:58:55 compute-0 systemd[1]: Reached target sshd-keygen.target.
Jan 21 17:58:55 compute-0 systemd[1]: Starting OpenSSH server daemon...
Jan 21 17:58:55 compute-0 sshd[129571]: Server listening on 0.0.0.0 port 22.
Jan 21 17:58:55 compute-0 sshd[129571]: Server listening on :: port 22.
Jan 21 17:58:55 compute-0 systemd[1]: Started OpenSSH server daemon.
Jan 21 17:58:56 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 21 17:58:56 compute-0 systemd[1]: Starting man-db-cache-update.service...
Jan 21 17:58:56 compute-0 systemd[1]: Reloading.
Jan 21 17:58:56 compute-0 systemd-rc-local-generator[129828]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 17:58:56 compute-0 systemd-sysv-generator[129831]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 17:58:57 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 21 17:59:00 compute-0 sudo[111540]: pam_unix(sudo:session): session closed for user root
Jan 21 17:59:06 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 21 17:59:06 compute-0 systemd[1]: Finished man-db-cache-update.service.
Jan 21 17:59:06 compute-0 systemd[1]: man-db-cache-update.service: Consumed 11.481s CPU time.
Jan 21 17:59:06 compute-0 systemd[1]: run-r1637a2733bc2467fb32c074fd7bb3385.service: Deactivated successfully.
Jan 21 17:59:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:59:20.052 104698 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 17:59:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:59:20.054 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 17:59:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 17:59:20.054 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 17:59:21 compute-0 podman[138232]: 2026-01-21 17:59:21.00483937 +0000 UTC m=+0.058580511 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 21 17:59:21 compute-0 podman[138231]: 2026-01-21 17:59:21.033793671 +0000 UTC m=+0.087606904 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 17:59:51 compute-0 podman[138275]: 2026-01-21 17:59:51.992724702 +0000 UTC m=+0.049017779 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 21 17:59:52 compute-0 podman[138274]: 2026-01-21 17:59:52.012364128 +0000 UTC m=+0.073112894 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 18:00:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:00:20.053 104698 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:00:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:00:20.054 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:00:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:00:20.054 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:00:20 compute-0 sshd-session[105245]: Received disconnect from 192.168.122.30 port 49994:11: disconnected by user
Jan 21 18:00:20 compute-0 sshd-session[105245]: Disconnected from user zuul 192.168.122.30 port 49994
Jan 21 18:00:20 compute-0 sshd-session[105242]: pam_unix(sshd:session): session closed for user zuul
Jan 21 18:00:20 compute-0 systemd[1]: session-23.scope: Deactivated successfully.
Jan 21 18:00:20 compute-0 systemd[1]: session-23.scope: Consumed 2min 2.468s CPU time.
Jan 21 18:00:20 compute-0 systemd-logind[782]: Session 23 logged out. Waiting for processes to exit.
Jan 21 18:00:20 compute-0 systemd-logind[782]: Removed session 23.
Jan 21 18:00:22 compute-0 sshd-session[138319]: Invalid user parity from 64.227.98.100 port 44326
Jan 21 18:00:22 compute-0 sshd-session[138319]: Connection closed by invalid user parity 64.227.98.100 port 44326 [preauth]
Jan 21 18:00:22 compute-0 podman[138322]: 2026-01-21 18:00:22.422573335 +0000 UTC m=+0.063639939 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 18:00:22 compute-0 podman[138321]: 2026-01-21 18:00:22.440654996 +0000 UTC m=+0.093899941 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, managed_by=edpm_ansible, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 18:00:23 compute-0 sshd-session[138366]: Accepted publickey for zuul from 192.168.122.30 port 45328 ssh2: ECDSA SHA256:cKen23vhgWniT1Xii+Y+iYDmAKCwydOnjbSSWnKmVRE
Jan 21 18:00:23 compute-0 systemd-logind[782]: New session 24 of user zuul.
Jan 21 18:00:23 compute-0 systemd[1]: Started Session 24 of User zuul.
Jan 21 18:00:23 compute-0 sshd-session[138366]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 21 18:00:24 compute-0 sudo[138495]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tjhnxyklhpycskisrgtpkvrngutsoszp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018423.8418672-647-110347311463787/AnsiballZ_systemd.py'
Jan 21 18:00:24 compute-0 sudo[138495]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:00:24 compute-0 python3.9[138497]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 21 18:00:24 compute-0 systemd[1]: Reloading.
Jan 21 18:00:24 compute-0 systemd-rc-local-generator[138524]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:00:24 compute-0 systemd-sysv-generator[138529]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:00:25 compute-0 sudo[138495]: pam_unix(sudo:session): session closed for user root
Jan 21 18:00:26 compute-0 sudo[138684]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ojpxnkcdotoworiisypmhwqbmkwzhafn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018425.9665723-647-22852801666342/AnsiballZ_systemd.py'
Jan 21 18:00:26 compute-0 sudo[138684]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:00:26 compute-0 python3.9[138686]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 21 18:00:26 compute-0 systemd[1]: Reloading.
Jan 21 18:00:26 compute-0 systemd-sysv-generator[138720]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:00:26 compute-0 systemd-rc-local-generator[138716]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:00:26 compute-0 sudo[138684]: pam_unix(sudo:session): session closed for user root
Jan 21 18:00:27 compute-0 sudo[138874]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zjwjaurkxwdldiyqtbelswkbveynwnxh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018427.0827494-647-107654887160930/AnsiballZ_systemd.py'
Jan 21 18:00:27 compute-0 sudo[138874]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:00:27 compute-0 python3.9[138876]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 21 18:00:27 compute-0 systemd[1]: Reloading.
Jan 21 18:00:27 compute-0 systemd-rc-local-generator[138907]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:00:27 compute-0 systemd-sysv-generator[138910]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:00:28 compute-0 sudo[138874]: pam_unix(sudo:session): session closed for user root
Jan 21 18:00:28 compute-0 sudo[139065]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oiiproxwabwudjneowjeurduzqmbdhtm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018428.2036119-647-169742591083105/AnsiballZ_systemd.py'
Jan 21 18:00:28 compute-0 sudo[139065]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:00:28 compute-0 python3.9[139067]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 21 18:00:28 compute-0 systemd[1]: Reloading.
Jan 21 18:00:28 compute-0 systemd-rc-local-generator[139095]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:00:28 compute-0 systemd-sysv-generator[139099]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:00:29 compute-0 sudo[139065]: pam_unix(sudo:session): session closed for user root
Jan 21 18:00:29 compute-0 sudo[139254]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ryktkpkeyygvyfvgmtowtoewjaqofvui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018429.6677814-708-4573092313997/AnsiballZ_systemd.py'
Jan 21 18:00:29 compute-0 sudo[139254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:00:30 compute-0 python3.9[139256]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 21 18:00:30 compute-0 systemd[1]: Reloading.
Jan 21 18:00:30 compute-0 systemd-rc-local-generator[139287]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:00:30 compute-0 systemd-sysv-generator[139291]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:00:30 compute-0 sudo[139254]: pam_unix(sudo:session): session closed for user root
Jan 21 18:00:30 compute-0 sudo[139444]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qiouapggqvahrpldugugfraozghapqdt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018430.6898444-708-219997475417253/AnsiballZ_systemd.py'
Jan 21 18:00:30 compute-0 sudo[139444]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:00:31 compute-0 python3.9[139446]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 21 18:00:31 compute-0 systemd[1]: Reloading.
Jan 21 18:00:31 compute-0 systemd-rc-local-generator[139479]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:00:31 compute-0 systemd-sysv-generator[139483]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:00:31 compute-0 sudo[139444]: pam_unix(sudo:session): session closed for user root
Jan 21 18:00:32 compute-0 sudo[139634]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-itxzwwckaxhvntmmboyrtonumxgbmlha ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018431.778029-708-266407373900420/AnsiballZ_systemd.py'
Jan 21 18:00:32 compute-0 sudo[139634]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:00:32 compute-0 python3.9[139636]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 21 18:00:32 compute-0 systemd[1]: Reloading.
Jan 21 18:00:32 compute-0 systemd-rc-local-generator[139666]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:00:32 compute-0 systemd-sysv-generator[139670]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:00:32 compute-0 sudo[139634]: pam_unix(sudo:session): session closed for user root
Jan 21 18:00:33 compute-0 sudo[139823]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xghirqxrfqnhmsudkvjfuhyogrxcqbtd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018432.9955192-708-22445661435252/AnsiballZ_systemd.py'
Jan 21 18:00:33 compute-0 sudo[139823]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:00:33 compute-0 python3.9[139825]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 21 18:00:33 compute-0 sudo[139823]: pam_unix(sudo:session): session closed for user root
Jan 21 18:00:34 compute-0 sudo[139978]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mztthdkjtuuguvasvsjuhqrrvsurhkhh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018433.7638173-708-216184153825521/AnsiballZ_systemd.py'
Jan 21 18:00:34 compute-0 sudo[139978]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:00:34 compute-0 python3.9[139980]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 21 18:00:34 compute-0 systemd[1]: Reloading.
Jan 21 18:00:34 compute-0 systemd-rc-local-generator[140009]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:00:34 compute-0 systemd-sysv-generator[140015]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:00:37 compute-0 sudo[139978]: pam_unix(sudo:session): session closed for user root
Jan 21 18:00:38 compute-0 sudo[140168]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qotxgieloxellywicmkntaqsbyoqvfwq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018437.7734485-780-222737743299821/AnsiballZ_systemd.py'
Jan 21 18:00:38 compute-0 sudo[140168]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:00:38 compute-0 python3.9[140170]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 21 18:00:38 compute-0 systemd[1]: Reloading.
Jan 21 18:00:38 compute-0 systemd-rc-local-generator[140197]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:00:38 compute-0 systemd-sysv-generator[140200]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:00:38 compute-0 systemd[1]: Listening on libvirt proxy daemon socket.
Jan 21 18:00:38 compute-0 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Jan 21 18:00:38 compute-0 sudo[140168]: pam_unix(sudo:session): session closed for user root
Jan 21 18:00:39 compute-0 sudo[140361]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kmqgsnralysqtmavaggbfxacincfdijb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018439.2574694-796-184335376329698/AnsiballZ_systemd.py'
Jan 21 18:00:39 compute-0 sudo[140361]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:00:39 compute-0 python3.9[140363]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 21 18:00:39 compute-0 sudo[140361]: pam_unix(sudo:session): session closed for user root
Jan 21 18:00:40 compute-0 sudo[140516]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-umdqferjupcuyasqcfqdksdwfzpwskum ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018440.0156877-796-76552125954094/AnsiballZ_systemd.py'
Jan 21 18:00:40 compute-0 sudo[140516]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:00:40 compute-0 python3.9[140518]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 21 18:00:40 compute-0 sudo[140516]: pam_unix(sudo:session): session closed for user root
Jan 21 18:00:41 compute-0 sudo[140671]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xbwhmuemjmafmquoimvyvrtovlnohtxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018440.783809-796-120282561285050/AnsiballZ_systemd.py'
Jan 21 18:00:41 compute-0 sudo[140671]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:00:41 compute-0 python3.9[140673]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 21 18:00:41 compute-0 sudo[140671]: pam_unix(sudo:session): session closed for user root
Jan 21 18:00:41 compute-0 sudo[140826]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bbwozykbqdpsdsiyieyrdymeaiijrmaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018441.6861827-796-187011715612482/AnsiballZ_systemd.py'
Jan 21 18:00:41 compute-0 sudo[140826]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:00:42 compute-0 python3.9[140828]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 21 18:00:42 compute-0 sudo[140826]: pam_unix(sudo:session): session closed for user root
Jan 21 18:00:42 compute-0 sudo[140981]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rzysneqmrvlbhbzpgbxwgpwvhyqchcye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018442.4737327-796-25980796456008/AnsiballZ_systemd.py'
Jan 21 18:00:42 compute-0 sudo[140981]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:00:43 compute-0 python3.9[140983]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 21 18:00:43 compute-0 sudo[140981]: pam_unix(sudo:session): session closed for user root
Jan 21 18:00:43 compute-0 sudo[141136]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-prwhaphijngtqnjerrzuqmjmtkswebru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018443.2757092-796-19213395320154/AnsiballZ_systemd.py'
Jan 21 18:00:43 compute-0 sudo[141136]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:00:43 compute-0 python3.9[141138]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 21 18:00:43 compute-0 sudo[141136]: pam_unix(sudo:session): session closed for user root
Jan 21 18:00:44 compute-0 sudo[141291]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hnuexxytybqbknmfjoeetpndoopwldaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018444.0662086-796-278772711581402/AnsiballZ_systemd.py'
Jan 21 18:00:44 compute-0 sudo[141291]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:00:44 compute-0 python3.9[141293]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 21 18:00:44 compute-0 sudo[141291]: pam_unix(sudo:session): session closed for user root
Jan 21 18:00:45 compute-0 sudo[141446]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kxaqqbgsxhgvrxjfyaqqqtlwfvzridox ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018444.8251088-796-230483666676241/AnsiballZ_systemd.py'
Jan 21 18:00:45 compute-0 sudo[141446]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:00:45 compute-0 python3.9[141448]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 21 18:00:46 compute-0 sudo[141446]: pam_unix(sudo:session): session closed for user root
Jan 21 18:00:47 compute-0 sudo[141601]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ueakshlxioknrszrcfrqfjnpatqpnxiy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018446.6116407-796-24625662859704/AnsiballZ_systemd.py'
Jan 21 18:00:47 compute-0 sudo[141601]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:00:47 compute-0 python3.9[141603]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 21 18:00:47 compute-0 sudo[141601]: pam_unix(sudo:session): session closed for user root
Jan 21 18:00:47 compute-0 sshd-session[141604]: Invalid user admin from 2.57.121.112 port 38513
Jan 21 18:00:47 compute-0 sudo[141758]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eqnpfujrvvmpyxicnknetiuchfjlnfof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018447.6133611-796-261643958949245/AnsiballZ_systemd.py'
Jan 21 18:00:47 compute-0 sudo[141758]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:00:48 compute-0 python3.9[141760]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 21 18:00:48 compute-0 sudo[141758]: pam_unix(sudo:session): session closed for user root
Jan 21 18:00:48 compute-0 sshd-session[141604]: Received disconnect from 2.57.121.112 port 38513:11: Bye [preauth]
Jan 21 18:00:48 compute-0 sshd-session[141604]: Disconnected from invalid user admin 2.57.121.112 port 38513 [preauth]
Jan 21 18:00:48 compute-0 sudo[141913]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-osglwjaatluduxppqnxgjzvasofigtbv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018448.4764857-796-14284606515931/AnsiballZ_systemd.py'
Jan 21 18:00:48 compute-0 sudo[141913]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:00:49 compute-0 python3.9[141915]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 21 18:00:49 compute-0 sudo[141913]: pam_unix(sudo:session): session closed for user root
Jan 21 18:00:49 compute-0 sudo[142068]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dijfpbwwxgilenwlyblwucsxhyubhsvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018449.2718494-796-200500844087798/AnsiballZ_systemd.py'
Jan 21 18:00:49 compute-0 sudo[142068]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:00:49 compute-0 python3.9[142070]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 21 18:00:49 compute-0 sudo[142068]: pam_unix(sudo:session): session closed for user root
Jan 21 18:00:50 compute-0 sudo[142223]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aqltnfzldvulxuxjqcytlvqsgagmlrpq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018450.1323025-796-230357433077244/AnsiballZ_systemd.py'
Jan 21 18:00:50 compute-0 sudo[142223]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:00:50 compute-0 python3.9[142225]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 21 18:00:50 compute-0 sudo[142223]: pam_unix(sudo:session): session closed for user root
Jan 21 18:00:51 compute-0 sudo[142378]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ampquecmyqjtkpyusfpgvqxnlfxcrdyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018450.8435369-796-246146517994176/AnsiballZ_systemd.py'
Jan 21 18:00:51 compute-0 sudo[142378]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:00:51 compute-0 python3.9[142380]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 21 18:00:51 compute-0 sudo[142378]: pam_unix(sudo:session): session closed for user root
Jan 21 18:00:53 compute-0 podman[142409]: 2026-01-21 18:00:53.013597307 +0000 UTC m=+0.058871316 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 21 18:00:53 compute-0 podman[142408]: 2026-01-21 18:00:53.072857131 +0000 UTC m=+0.118124540 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 21 18:00:53 compute-0 sudo[142578]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gyjoacvxvetargnpcrpsfmupotqlzvwz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018453.3117802-1000-183283124168223/AnsiballZ_file.py'
Jan 21 18:00:53 compute-0 sudo[142578]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:00:53 compute-0 python3.9[142580]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:00:53 compute-0 sudo[142578]: pam_unix(sudo:session): session closed for user root
Jan 21 18:00:54 compute-0 sudo[142730]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efiuofzgsdzziaqicdlxwkwakszjikze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018453.9056413-1000-189388713573387/AnsiballZ_file.py'
Jan 21 18:00:54 compute-0 sudo[142730]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:00:54 compute-0 python3.9[142732]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:00:54 compute-0 sudo[142730]: pam_unix(sudo:session): session closed for user root
Jan 21 18:00:54 compute-0 sudo[142882]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sultppqdcnjpoqsdbukadrdaasydxmxp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018454.5765963-1000-85813341013045/AnsiballZ_file.py'
Jan 21 18:00:54 compute-0 sudo[142882]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:00:54 compute-0 python3.9[142884]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:00:55 compute-0 sudo[142882]: pam_unix(sudo:session): session closed for user root
Jan 21 18:00:55 compute-0 sudo[143034]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qsczhbdawvywgkuaiqzlofeempovfguh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018455.139497-1000-201275382731593/AnsiballZ_file.py'
Jan 21 18:00:55 compute-0 sudo[143034]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:00:55 compute-0 python3.9[143036]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:00:55 compute-0 sudo[143034]: pam_unix(sudo:session): session closed for user root
Jan 21 18:00:55 compute-0 sudo[143186]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-svqybgdfsjldmappepuvvperujpdkcvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018455.6911385-1000-231230888104150/AnsiballZ_file.py'
Jan 21 18:00:55 compute-0 sudo[143186]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:00:56 compute-0 python3.9[143188]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:00:56 compute-0 sudo[143186]: pam_unix(sudo:session): session closed for user root
Jan 21 18:00:56 compute-0 sudo[143338]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ajnwfnwcylcnawzcqjmzjqhbdpqkoyey ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018456.2922685-1000-187317552944974/AnsiballZ_file.py'
Jan 21 18:00:56 compute-0 sudo[143338]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:00:56 compute-0 python3.9[143340]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:00:56 compute-0 sudo[143338]: pam_unix(sudo:session): session closed for user root
Jan 21 18:01:00 compute-0 python3.9[143490]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 18:01:00 compute-0 sudo[143640]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fcciwexifiztadcvufxscylpshfxdazd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018460.5138483-1102-195216593766144/AnsiballZ_stat.py'
Jan 21 18:01:00 compute-0 sudo[143640]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:01:01 compute-0 python3.9[143642]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:01:01 compute-0 sudo[143640]: pam_unix(sudo:session): session closed for user root
Jan 21 18:01:01 compute-0 CROND[143693]: (root) CMD (run-parts /etc/cron.hourly)
Jan 21 18:01:01 compute-0 run-parts[143697]: (/etc/cron.hourly) starting 0anacron
Jan 21 18:01:01 compute-0 anacron[143712]: Anacron started on 2026-01-21
Jan 21 18:01:01 compute-0 anacron[143712]: Will run job `cron.daily' in 5 min.
Jan 21 18:01:01 compute-0 anacron[143712]: Will run job `cron.weekly' in 25 min.
Jan 21 18:01:01 compute-0 anacron[143712]: Will run job `cron.monthly' in 45 min.
Jan 21 18:01:01 compute-0 anacron[143712]: Jobs will be executed sequentially
Jan 21 18:01:01 compute-0 run-parts[143718]: (/etc/cron.hourly) finished 0anacron
Jan 21 18:01:01 compute-0 CROND[143692]: (root) CMDEND (run-parts /etc/cron.hourly)
Jan 21 18:01:01 compute-0 sudo[143780]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fovdzdeykbxuvmtwwkwxqhjozxbuztob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018460.5138483-1102-195216593766144/AnsiballZ_copy.py'
Jan 21 18:01:01 compute-0 sudo[143780]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:01:01 compute-0 python3.9[143782]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769018460.5138483-1102-195216593766144/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:01:01 compute-0 sudo[143780]: pam_unix(sudo:session): session closed for user root
Jan 21 18:01:02 compute-0 sudo[143932]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ugfaglktapllzawnxwkuubbglfibigty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018462.0245612-1102-178590323964705/AnsiballZ_stat.py'
Jan 21 18:01:02 compute-0 sudo[143932]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:01:02 compute-0 python3.9[143934]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:01:02 compute-0 sudo[143932]: pam_unix(sudo:session): session closed for user root
Jan 21 18:01:02 compute-0 sudo[144057]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvdttiuhtsyzviczaesiywtnibvamqxj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018462.0245612-1102-178590323964705/AnsiballZ_copy.py'
Jan 21 18:01:02 compute-0 sudo[144057]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:01:03 compute-0 python3.9[144059]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769018462.0245612-1102-178590323964705/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:01:03 compute-0 sudo[144057]: pam_unix(sudo:session): session closed for user root
Jan 21 18:01:03 compute-0 sudo[144209]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xghpqksrtzvbeuihbeosggpdrtjtjbnn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018463.2563899-1102-48820104423516/AnsiballZ_stat.py'
Jan 21 18:01:03 compute-0 sudo[144209]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:01:03 compute-0 python3.9[144211]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:01:03 compute-0 sudo[144209]: pam_unix(sudo:session): session closed for user root
Jan 21 18:01:04 compute-0 sudo[144334]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mjlesrzvfgrzuqetzzigapjzesqkefup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018463.2563899-1102-48820104423516/AnsiballZ_copy.py'
Jan 21 18:01:04 compute-0 sudo[144334]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:01:04 compute-0 python3.9[144336]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769018463.2563899-1102-48820104423516/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:01:04 compute-0 sudo[144334]: pam_unix(sudo:session): session closed for user root
Jan 21 18:01:04 compute-0 sudo[144486]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zivgfbzmxymdsbzxayxtqqloujirjihp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018464.520864-1102-147296870330804/AnsiballZ_stat.py'
Jan 21 18:01:04 compute-0 sudo[144486]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:01:04 compute-0 python3.9[144488]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:01:05 compute-0 sudo[144486]: pam_unix(sudo:session): session closed for user root
Jan 21 18:01:05 compute-0 sudo[144611]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ylddeavvlrbtkheymvtcnipkqzjgywpu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018464.520864-1102-147296870330804/AnsiballZ_copy.py'
Jan 21 18:01:05 compute-0 sudo[144611]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:01:05 compute-0 python3.9[144613]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769018464.520864-1102-147296870330804/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:01:05 compute-0 sudo[144611]: pam_unix(sudo:session): session closed for user root
Jan 21 18:01:06 compute-0 sudo[144763]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzvmmtwnhcfcesmolasqmnudhnrrpvsn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018465.7179441-1102-222591476343946/AnsiballZ_stat.py'
Jan 21 18:01:06 compute-0 sudo[144763]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:01:06 compute-0 python3.9[144765]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:01:06 compute-0 sudo[144763]: pam_unix(sudo:session): session closed for user root
Jan 21 18:01:06 compute-0 sudo[144888]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vtpetighwmieyyexcoifaszcxshtjesn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018465.7179441-1102-222591476343946/AnsiballZ_copy.py'
Jan 21 18:01:06 compute-0 sudo[144888]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:01:06 compute-0 python3.9[144890]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769018465.7179441-1102-222591476343946/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:01:06 compute-0 sudo[144888]: pam_unix(sudo:session): session closed for user root
Jan 21 18:01:07 compute-0 sudo[145040]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nibdmvzulqklvglgtzogzidufsqfcqwp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018467.0239434-1102-96647090960088/AnsiballZ_stat.py'
Jan 21 18:01:07 compute-0 sudo[145040]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:01:07 compute-0 python3.9[145042]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:01:07 compute-0 sudo[145040]: pam_unix(sudo:session): session closed for user root
Jan 21 18:01:07 compute-0 sudo[145165]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qsgbihfffflwrkdvoliyttbyyqzoveax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018467.0239434-1102-96647090960088/AnsiballZ_copy.py'
Jan 21 18:01:07 compute-0 sudo[145165]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:01:08 compute-0 python3.9[145167]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769018467.0239434-1102-96647090960088/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:01:08 compute-0 sudo[145165]: pam_unix(sudo:session): session closed for user root
Jan 21 18:01:08 compute-0 sudo[145317]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fafojrgioefrgqjaheibaltrqrtrxxyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018468.4081693-1102-115914916420480/AnsiballZ_stat.py'
Jan 21 18:01:08 compute-0 sudo[145317]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:01:08 compute-0 python3.9[145319]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:01:08 compute-0 sudo[145317]: pam_unix(sudo:session): session closed for user root
Jan 21 18:01:09 compute-0 sudo[145440]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-auyymtioufvhozlnjjsoterodwlvrmcf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018468.4081693-1102-115914916420480/AnsiballZ_copy.py'
Jan 21 18:01:09 compute-0 sudo[145440]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:01:09 compute-0 python3.9[145442]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769018468.4081693-1102-115914916420480/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:01:09 compute-0 sudo[145440]: pam_unix(sudo:session): session closed for user root
Jan 21 18:01:09 compute-0 sudo[145592]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tayyenytzakzdobmggtsmarbbjjyyamy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018469.591486-1102-63283592175722/AnsiballZ_stat.py'
Jan 21 18:01:09 compute-0 sudo[145592]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:01:10 compute-0 python3.9[145594]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:01:10 compute-0 sudo[145592]: pam_unix(sudo:session): session closed for user root
Jan 21 18:01:10 compute-0 sudo[145717]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgyhayqlcajsdvsgmxytqmonytrvdput ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018469.591486-1102-63283592175722/AnsiballZ_copy.py'
Jan 21 18:01:10 compute-0 sudo[145717]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:01:10 compute-0 python3.9[145719]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769018469.591486-1102-63283592175722/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:01:10 compute-0 sudo[145717]: pam_unix(sudo:session): session closed for user root
Jan 21 18:01:11 compute-0 sudo[145869]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uzciqjddtywqvcmaezzqswdorazgyrln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018471.0809584-1328-280286339037035/AnsiballZ_command.py'
Jan 21 18:01:11 compute-0 sudo[145869]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:01:11 compute-0 python3.9[145871]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Jan 21 18:01:11 compute-0 sudo[145869]: pam_unix(sudo:session): session closed for user root
Jan 21 18:01:12 compute-0 sudo[146022]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ayooylzjlwjismhblvsikwqzjedqppql ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018471.812695-1346-217338760360339/AnsiballZ_file.py'
Jan 21 18:01:12 compute-0 sudo[146022]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:01:12 compute-0 python3.9[146024]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:01:12 compute-0 sudo[146022]: pam_unix(sudo:session): session closed for user root
Jan 21 18:01:12 compute-0 sudo[146174]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rbohznnjpdaonqearpseqlyqgkphxayu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018472.4753878-1346-28962240100632/AnsiballZ_file.py'
Jan 21 18:01:12 compute-0 sudo[146174]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:01:12 compute-0 python3.9[146176]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:01:12 compute-0 sudo[146174]: pam_unix(sudo:session): session closed for user root
Jan 21 18:01:13 compute-0 sudo[146326]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bkfptjzdtdmpznhsgpnnrlqjftcqgtbb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018473.0135903-1346-125406023437069/AnsiballZ_file.py'
Jan 21 18:01:13 compute-0 sudo[146326]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:01:13 compute-0 python3.9[146328]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:01:13 compute-0 sudo[146326]: pam_unix(sudo:session): session closed for user root
Jan 21 18:01:13 compute-0 sudo[146478]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zywckelnosdlhlcgjymmcajksphinbss ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018473.6235807-1346-204272010791910/AnsiballZ_file.py'
Jan 21 18:01:13 compute-0 sudo[146478]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:01:14 compute-0 python3.9[146480]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:01:14 compute-0 sudo[146478]: pam_unix(sudo:session): session closed for user root
Jan 21 18:01:14 compute-0 sudo[146630]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ksdfegihgaeemteeabpshrcqaqlmahly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018474.2340894-1346-59532176628813/AnsiballZ_file.py'
Jan 21 18:01:14 compute-0 sudo[146630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:01:14 compute-0 python3.9[146632]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:01:14 compute-0 sudo[146630]: pam_unix(sudo:session): session closed for user root
Jan 21 18:01:15 compute-0 sudo[146782]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ohsbfffzgqaqjtdudbgxgfqvyzhbtxti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018474.8482482-1346-250301993900193/AnsiballZ_file.py'
Jan 21 18:01:15 compute-0 sudo[146782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:01:15 compute-0 python3.9[146784]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:01:15 compute-0 sudo[146782]: pam_unix(sudo:session): session closed for user root
Jan 21 18:01:15 compute-0 sudo[146934]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zstticvydnlwljlhkpeutrvmygckawdr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018475.4666169-1346-175453590664603/AnsiballZ_file.py'
Jan 21 18:01:15 compute-0 sudo[146934]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:01:15 compute-0 python3.9[146936]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:01:15 compute-0 sudo[146934]: pam_unix(sudo:session): session closed for user root
Jan 21 18:01:16 compute-0 sudo[147086]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gealqoywsmkzreuktzjkdoqlrkonqlya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018476.0619185-1346-5308639732352/AnsiballZ_file.py'
Jan 21 18:01:16 compute-0 sudo[147086]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:01:16 compute-0 python3.9[147088]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:01:16 compute-0 sudo[147086]: pam_unix(sudo:session): session closed for user root
Jan 21 18:01:17 compute-0 sudo[147238]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vackobxhegqrhgahutwwzpmhptdbtbpg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018476.6884375-1346-156864020670035/AnsiballZ_file.py'
Jan 21 18:01:17 compute-0 sudo[147238]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:01:17 compute-0 python3.9[147240]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:01:17 compute-0 sudo[147238]: pam_unix(sudo:session): session closed for user root
Jan 21 18:01:17 compute-0 sudo[147390]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kswjfloforkddhrbrlvpkqaihodrsbbw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018477.3803654-1346-203094581743954/AnsiballZ_file.py'
Jan 21 18:01:17 compute-0 sudo[147390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:01:17 compute-0 python3.9[147392]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:01:17 compute-0 sudo[147390]: pam_unix(sudo:session): session closed for user root
Jan 21 18:01:18 compute-0 sudo[147542]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fudnhsgwmwwlcrrnmiodbkdthyweyqwf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018477.9765725-1346-11078955344702/AnsiballZ_file.py'
Jan 21 18:01:18 compute-0 sudo[147542]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:01:18 compute-0 python3.9[147544]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:01:18 compute-0 sudo[147542]: pam_unix(sudo:session): session closed for user root
Jan 21 18:01:18 compute-0 sudo[147694]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qewmpfkfvyupwaweczctsdinevcjdviq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018478.5812516-1346-141490819195673/AnsiballZ_file.py'
Jan 21 18:01:18 compute-0 sudo[147694]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:01:19 compute-0 python3.9[147696]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:01:19 compute-0 sudo[147694]: pam_unix(sudo:session): session closed for user root
Jan 21 18:01:19 compute-0 sudo[147846]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kynmsrxjvustrqybkrxoanobjriiomsa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018479.1595683-1346-268347677745404/AnsiballZ_file.py'
Jan 21 18:01:19 compute-0 sudo[147846]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:01:19 compute-0 python3.9[147848]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:01:19 compute-0 sudo[147846]: pam_unix(sudo:session): session closed for user root
Jan 21 18:01:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:01:20.055 104698 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:01:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:01:20.056 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:01:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:01:20.056 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:01:20 compute-0 sudo[147998]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ghmfvylaxzzuigqlnvrfhrlhkpnpvgik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018479.8119214-1346-11052474156359/AnsiballZ_file.py'
Jan 21 18:01:20 compute-0 sudo[147998]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:01:20 compute-0 python3.9[148000]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:01:20 compute-0 sudo[147998]: pam_unix(sudo:session): session closed for user root
Jan 21 18:01:21 compute-0 sudo[148150]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jghmndazqbuhidecwnepikipfdmlmyvn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018480.939425-1544-242038925370179/AnsiballZ_stat.py'
Jan 21 18:01:21 compute-0 sudo[148150]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:01:21 compute-0 python3.9[148152]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:01:21 compute-0 sudo[148150]: pam_unix(sudo:session): session closed for user root
Jan 21 18:01:21 compute-0 sudo[148273]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nvirtfyroigkvztdkmxirjzvwmukiyol ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018480.939425-1544-242038925370179/AnsiballZ_copy.py'
Jan 21 18:01:21 compute-0 sudo[148273]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:01:21 compute-0 python3.9[148275]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769018480.939425-1544-242038925370179/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:01:21 compute-0 sudo[148273]: pam_unix(sudo:session): session closed for user root
Jan 21 18:01:22 compute-0 sudo[148425]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-flknlbwhwpnjomhaindswiiftpvjombu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018482.0906682-1544-49811185291813/AnsiballZ_stat.py'
Jan 21 18:01:22 compute-0 sudo[148425]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:01:22 compute-0 python3.9[148427]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:01:22 compute-0 sudo[148425]: pam_unix(sudo:session): session closed for user root
Jan 21 18:01:23 compute-0 sudo[148558]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wsevdlpfzsqfynlfaluhysddcxnynazf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018482.0906682-1544-49811185291813/AnsiballZ_copy.py'
Jan 21 18:01:23 compute-0 sudo[148558]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:01:23 compute-0 podman[148522]: 2026-01-21 18:01:23.159377235 +0000 UTC m=+0.096195195 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 18:01:23 compute-0 podman[148567]: 2026-01-21 18:01:23.222733493 +0000 UTC m=+0.089670661 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3)
Jan 21 18:01:23 compute-0 python3.9[148570]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769018482.0906682-1544-49811185291813/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:01:23 compute-0 sudo[148558]: pam_unix(sudo:session): session closed for user root
Jan 21 18:01:23 compute-0 sudo[148746]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cwtouwdouctpnkaaawpbyjrjnkpdazlo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018483.4892461-1544-41589400868361/AnsiballZ_stat.py'
Jan 21 18:01:23 compute-0 sudo[148746]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:01:23 compute-0 python3.9[148748]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:01:23 compute-0 sudo[148746]: pam_unix(sudo:session): session closed for user root
Jan 21 18:01:24 compute-0 sudo[148869]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ekopjostjrvsdcmpggxzmfqsqsmdhhtu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018483.4892461-1544-41589400868361/AnsiballZ_copy.py'
Jan 21 18:01:24 compute-0 sudo[148869]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:01:24 compute-0 python3.9[148871]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769018483.4892461-1544-41589400868361/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:01:24 compute-0 sudo[148869]: pam_unix(sudo:session): session closed for user root
Jan 21 18:01:24 compute-0 sudo[149021]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qsxxyxrtywkxhcdzccbvyagltwxuzooe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018484.7098184-1544-9805491548082/AnsiballZ_stat.py'
Jan 21 18:01:24 compute-0 sudo[149021]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:01:25 compute-0 python3.9[149023]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:01:25 compute-0 sudo[149021]: pam_unix(sudo:session): session closed for user root
Jan 21 18:01:25 compute-0 sudo[149144]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tkjqmxjumsvbhqaddyzyewbgxqdqqvns ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018484.7098184-1544-9805491548082/AnsiballZ_copy.py'
Jan 21 18:01:25 compute-0 sudo[149144]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:01:25 compute-0 python3.9[149146]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769018484.7098184-1544-9805491548082/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:01:25 compute-0 sudo[149144]: pam_unix(sudo:session): session closed for user root
Jan 21 18:01:26 compute-0 sudo[149296]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cyzmrpollvyomckojkegykobjxlttifk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018485.8940382-1544-276231146943875/AnsiballZ_stat.py'
Jan 21 18:01:26 compute-0 sudo[149296]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:01:26 compute-0 python3.9[149298]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:01:26 compute-0 sudo[149296]: pam_unix(sudo:session): session closed for user root
Jan 21 18:01:26 compute-0 sudo[149419]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pzivebfexoithrzfhmpjczgolbgffvtg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018485.8940382-1544-276231146943875/AnsiballZ_copy.py'
Jan 21 18:01:26 compute-0 sudo[149419]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:01:26 compute-0 python3.9[149421]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769018485.8940382-1544-276231146943875/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:01:26 compute-0 sudo[149419]: pam_unix(sudo:session): session closed for user root
Jan 21 18:01:27 compute-0 sudo[149571]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zokhhvmgycopgfouxuyvzucfsegtypco ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018487.0669672-1544-105386784122294/AnsiballZ_stat.py'
Jan 21 18:01:27 compute-0 sudo[149571]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:01:27 compute-0 python3.9[149573]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:01:27 compute-0 sudo[149571]: pam_unix(sudo:session): session closed for user root
Jan 21 18:01:27 compute-0 sudo[149694]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sbkqgpbvpqyvuvegptlxgsysktslgqhm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018487.0669672-1544-105386784122294/AnsiballZ_copy.py'
Jan 21 18:01:27 compute-0 sudo[149694]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:01:28 compute-0 python3.9[149696]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769018487.0669672-1544-105386784122294/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:01:28 compute-0 sudo[149694]: pam_unix(sudo:session): session closed for user root
Jan 21 18:01:28 compute-0 sudo[149846]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-icmwkvexcbcfqnkbjlmhdtfyjwrqpuwe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018488.351905-1544-77182764219525/AnsiballZ_stat.py'
Jan 21 18:01:28 compute-0 sudo[149846]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:01:28 compute-0 python3.9[149848]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:01:28 compute-0 sudo[149846]: pam_unix(sudo:session): session closed for user root
Jan 21 18:01:29 compute-0 sudo[149969]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tbyaltnyxvebcwdpxpkuwpacynncmlmt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018488.351905-1544-77182764219525/AnsiballZ_copy.py'
Jan 21 18:01:29 compute-0 sudo[149969]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:01:29 compute-0 python3.9[149971]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769018488.351905-1544-77182764219525/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:01:29 compute-0 sudo[149969]: pam_unix(sudo:session): session closed for user root
Jan 21 18:01:29 compute-0 sudo[150121]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ipzmpzoqegkutzpglyexlsldmsbcnipd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018489.6765294-1544-202632853899475/AnsiballZ_stat.py'
Jan 21 18:01:29 compute-0 sudo[150121]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:01:30 compute-0 python3.9[150123]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:01:30 compute-0 sudo[150121]: pam_unix(sudo:session): session closed for user root
Jan 21 18:01:30 compute-0 sudo[150244]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgxfnooaodnzjlyhhaofpurnnoseizbj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018489.6765294-1544-202632853899475/AnsiballZ_copy.py'
Jan 21 18:01:30 compute-0 sudo[150244]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:01:30 compute-0 python3.9[150246]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769018489.6765294-1544-202632853899475/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:01:30 compute-0 sudo[150244]: pam_unix(sudo:session): session closed for user root
Jan 21 18:01:31 compute-0 sudo[150396]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yfbauqpltejjamwhozvxlhpmzntwmvit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018490.9020023-1544-132132164004943/AnsiballZ_stat.py'
Jan 21 18:01:31 compute-0 sudo[150396]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:01:31 compute-0 python3.9[150398]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:01:31 compute-0 sudo[150396]: pam_unix(sudo:session): session closed for user root
Jan 21 18:01:31 compute-0 sudo[150519]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cqyhmzfflcqdjbauufjgjftwqhecajld ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018490.9020023-1544-132132164004943/AnsiballZ_copy.py'
Jan 21 18:01:31 compute-0 sudo[150519]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:01:31 compute-0 python3.9[150521]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769018490.9020023-1544-132132164004943/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:01:31 compute-0 sudo[150519]: pam_unix(sudo:session): session closed for user root
Jan 21 18:01:32 compute-0 sudo[150671]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zzwnpnhsbvtyadvqlfvshijktboqcjgb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018492.073073-1544-96414618374471/AnsiballZ_stat.py'
Jan 21 18:01:32 compute-0 sudo[150671]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:01:32 compute-0 python3.9[150673]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:01:32 compute-0 sudo[150671]: pam_unix(sudo:session): session closed for user root
Jan 21 18:01:32 compute-0 sudo[150794]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-noenhxipdcmyfepgwwxghqbgyafgrspn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018492.073073-1544-96414618374471/AnsiballZ_copy.py'
Jan 21 18:01:32 compute-0 sudo[150794]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:01:32 compute-0 python3.9[150796]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769018492.073073-1544-96414618374471/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:01:33 compute-0 sudo[150794]: pam_unix(sudo:session): session closed for user root
Jan 21 18:01:33 compute-0 sudo[150946]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ztfamkscfauoixhhuurupubvkcvdhzyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018493.187075-1544-112599573870865/AnsiballZ_stat.py'
Jan 21 18:01:33 compute-0 sudo[150946]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:01:33 compute-0 python3.9[150948]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:01:33 compute-0 sudo[150946]: pam_unix(sudo:session): session closed for user root
Jan 21 18:01:34 compute-0 sudo[151069]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-frzdyjrxydseewzvyjutmhoszqtiqsoa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018493.187075-1544-112599573870865/AnsiballZ_copy.py'
Jan 21 18:01:34 compute-0 sudo[151069]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:01:34 compute-0 python3.9[151071]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769018493.187075-1544-112599573870865/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:01:34 compute-0 sudo[151069]: pam_unix(sudo:session): session closed for user root
Jan 21 18:01:34 compute-0 sudo[151221]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-stekajiuyxwsouczqubbypmowuysmwcu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018494.3579752-1544-199162587249100/AnsiballZ_stat.py'
Jan 21 18:01:34 compute-0 sudo[151221]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:01:34 compute-0 python3.9[151223]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:01:34 compute-0 sudo[151221]: pam_unix(sudo:session): session closed for user root
Jan 21 18:01:35 compute-0 sudo[151344]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgpopwvsbpctftotukgscwluwlfnrkzt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018494.3579752-1544-199162587249100/AnsiballZ_copy.py'
Jan 21 18:01:35 compute-0 sudo[151344]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:01:35 compute-0 python3.9[151346]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769018494.3579752-1544-199162587249100/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:01:35 compute-0 sudo[151344]: pam_unix(sudo:session): session closed for user root
Jan 21 18:01:35 compute-0 sudo[151496]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xvjzddsfdfjfkqbcprfrfkdqucvngdxo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018495.4393582-1544-190512217293645/AnsiballZ_stat.py'
Jan 21 18:01:35 compute-0 sudo[151496]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:01:35 compute-0 python3.9[151498]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:01:35 compute-0 sudo[151496]: pam_unix(sudo:session): session closed for user root
Jan 21 18:01:36 compute-0 sudo[151619]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oclnfabaddqmpedoghgmnboqinexlrlw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018495.4393582-1544-190512217293645/AnsiballZ_copy.py'
Jan 21 18:01:36 compute-0 sudo[151619]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:01:36 compute-0 python3.9[151621]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769018495.4393582-1544-190512217293645/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:01:36 compute-0 sudo[151619]: pam_unix(sudo:session): session closed for user root
Jan 21 18:01:36 compute-0 sudo[151771]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ypgtxwhvvxthnjdwfhuoizysxrxxahhv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018496.6496463-1544-92433544821307/AnsiballZ_stat.py'
Jan 21 18:01:36 compute-0 sudo[151771]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:01:37 compute-0 python3.9[151773]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:01:37 compute-0 sudo[151771]: pam_unix(sudo:session): session closed for user root
Jan 21 18:01:37 compute-0 sudo[151894]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zjwmjkwfcxsssbpursabcrjmbzshniwu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018496.6496463-1544-92433544821307/AnsiballZ_copy.py'
Jan 21 18:01:37 compute-0 sudo[151894]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:01:37 compute-0 python3.9[151896]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769018496.6496463-1544-92433544821307/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:01:37 compute-0 sudo[151894]: pam_unix(sudo:session): session closed for user root
Jan 21 18:01:41 compute-0 python3.9[152046]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                             ls -lRZ /run/libvirt | grep -E ':container_\S+_t'
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:01:42 compute-0 sudo[152199]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vjjuydtdaesnnlgoayowyyirqevanbqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018502.2975101-1956-23169538787244/AnsiballZ_seboolean.py'
Jan 21 18:01:42 compute-0 sudo[152199]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:01:42 compute-0 python3.9[152201]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Jan 21 18:01:44 compute-0 sudo[152199]: pam_unix(sudo:session): session closed for user root
Jan 21 18:01:50 compute-0 sudo[152355]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vemampemsgftzaacwuluufqrakxtahwe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018510.0825117-1972-9786815188677/AnsiballZ_copy.py'
Jan 21 18:01:50 compute-0 dbus-broker-launch[769]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Jan 21 18:01:50 compute-0 sudo[152355]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:01:50 compute-0 python3.9[152357]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:01:50 compute-0 sudo[152355]: pam_unix(sudo:session): session closed for user root
Jan 21 18:01:51 compute-0 sudo[152507]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ekwidkmrzpuwprckntvdcdgsyadlktjz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018510.8069324-1972-11044667121052/AnsiballZ_copy.py'
Jan 21 18:01:51 compute-0 sudo[152507]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:01:51 compute-0 python3.9[152509]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:01:51 compute-0 sudo[152507]: pam_unix(sudo:session): session closed for user root
Jan 21 18:01:51 compute-0 sudo[152659]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nqfrhaflumszyxobsypxntdanaztspwb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018511.4279788-1972-141619560000763/AnsiballZ_copy.py'
Jan 21 18:01:51 compute-0 sudo[152659]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:01:51 compute-0 python3.9[152661]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:01:51 compute-0 sudo[152659]: pam_unix(sudo:session): session closed for user root
Jan 21 18:01:52 compute-0 sudo[152811]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hijldrwqjjytawqtjiyhszldhynnijlj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018512.0250936-1972-55926868136633/AnsiballZ_copy.py'
Jan 21 18:01:52 compute-0 sudo[152811]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:01:52 compute-0 python3.9[152813]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:01:52 compute-0 sudo[152811]: pam_unix(sudo:session): session closed for user root
Jan 21 18:01:52 compute-0 sudo[152963]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tcmbdpnimhzpjjkfkcrhkbabvctsvpeb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018512.5998785-1972-273058411561988/AnsiballZ_copy.py'
Jan 21 18:01:52 compute-0 sudo[152963]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:01:53 compute-0 python3.9[152965]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:01:53 compute-0 sudo[152963]: pam_unix(sudo:session): session closed for user root
Jan 21 18:01:53 compute-0 sudo[153139]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ozirputtknveqnwjhktomyclfbmxkply ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018513.570416-2044-112514102523420/AnsiballZ_copy.py'
Jan 21 18:01:53 compute-0 sudo[153139]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:01:53 compute-0 podman[153090]: 2026-01-21 18:01:53.89843434 +0000 UTC m=+0.064412694 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 18:01:53 compute-0 podman[153089]: 2026-01-21 18:01:53.922306805 +0000 UTC m=+0.087956030 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 21 18:01:54 compute-0 python3.9[153150]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:01:54 compute-0 sudo[153139]: pam_unix(sudo:session): session closed for user root
Jan 21 18:01:54 compute-0 sudo[153312]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ebgbqbhokgsojkkmhpeluetvmhrwvavl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018514.2112198-2044-242650838155038/AnsiballZ_copy.py'
Jan 21 18:01:54 compute-0 sudo[153312]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:01:54 compute-0 python3.9[153314]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:01:54 compute-0 sudo[153312]: pam_unix(sudo:session): session closed for user root
Jan 21 18:01:55 compute-0 sudo[153464]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmkhbvacsjgwvgtlsplwmkaqapifdteo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018514.833424-2044-109926133398409/AnsiballZ_copy.py'
Jan 21 18:01:55 compute-0 sudo[153464]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:01:55 compute-0 python3.9[153466]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:01:55 compute-0 sudo[153464]: pam_unix(sudo:session): session closed for user root
Jan 21 18:01:55 compute-0 sudo[153616]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ccfpdipfqmvjtgmvdzibbciblgduwdvj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018515.412718-2044-149508747787275/AnsiballZ_copy.py'
Jan 21 18:01:55 compute-0 sudo[153616]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:01:55 compute-0 python3.9[153618]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:01:55 compute-0 sudo[153616]: pam_unix(sudo:session): session closed for user root
Jan 21 18:01:56 compute-0 sudo[153768]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ndgjwwcbyeejblwnrtatoxbxyjfvifgi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018515.971578-2044-66415122114105/AnsiballZ_copy.py'
Jan 21 18:01:56 compute-0 sudo[153768]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:01:56 compute-0 python3.9[153770]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:01:56 compute-0 sudo[153768]: pam_unix(sudo:session): session closed for user root
Jan 21 18:01:57 compute-0 sudo[153920]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-upuqxhnnefwuqcjvwxemzmwzuaromkbs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018517.2217493-2116-159302308919902/AnsiballZ_systemd.py'
Jan 21 18:01:57 compute-0 sudo[153920]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:01:57 compute-0 python3.9[153922]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 21 18:01:57 compute-0 systemd[1]: Reloading.
Jan 21 18:01:57 compute-0 systemd-sysv-generator[153952]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:01:57 compute-0 systemd-rc-local-generator[153948]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:01:58 compute-0 systemd[1]: Starting libvirt logging daemon socket...
Jan 21 18:01:58 compute-0 systemd[1]: Listening on libvirt logging daemon socket.
Jan 21 18:01:58 compute-0 systemd[1]: Starting libvirt logging daemon admin socket...
Jan 21 18:01:58 compute-0 systemd[1]: Listening on libvirt logging daemon admin socket.
Jan 21 18:01:58 compute-0 systemd[1]: Starting libvirt logging daemon...
Jan 21 18:01:58 compute-0 systemd[1]: Started libvirt logging daemon.
Jan 21 18:01:58 compute-0 sudo[153920]: pam_unix(sudo:session): session closed for user root
Jan 21 18:01:58 compute-0 sudo[154113]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oaayaixbolmsrunrrgxxewtztxdibguu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018518.3725863-2116-58302934815444/AnsiballZ_systemd.py'
Jan 21 18:01:58 compute-0 sudo[154113]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:01:59 compute-0 python3.9[154115]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 21 18:01:59 compute-0 systemd[1]: Reloading.
Jan 21 18:01:59 compute-0 systemd-rc-local-generator[154143]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:01:59 compute-0 systemd-sysv-generator[154146]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:01:59 compute-0 systemd[1]: Starting libvirt nodedev daemon socket...
Jan 21 18:01:59 compute-0 systemd[1]: Listening on libvirt nodedev daemon socket.
Jan 21 18:01:59 compute-0 systemd[1]: Starting libvirt nodedev daemon admin socket...
Jan 21 18:01:59 compute-0 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Jan 21 18:01:59 compute-0 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Jan 21 18:01:59 compute-0 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Jan 21 18:01:59 compute-0 systemd[1]: Starting libvirt nodedev daemon...
Jan 21 18:01:59 compute-0 systemd[1]: Started libvirt nodedev daemon.
Jan 21 18:01:59 compute-0 sudo[154113]: pam_unix(sudo:session): session closed for user root
Jan 21 18:01:59 compute-0 sudo[154329]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-goteukpefxyvpwjawkbuelvssqwtbbev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018519.6095486-2116-134459286060842/AnsiballZ_systemd.py'
Jan 21 18:01:59 compute-0 sudo[154329]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:02:00 compute-0 python3.9[154331]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 21 18:02:00 compute-0 systemd[1]: Reloading.
Jan 21 18:02:00 compute-0 systemd-rc-local-generator[154357]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:02:00 compute-0 systemd-sysv-generator[154361]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:02:00 compute-0 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Jan 21 18:02:00 compute-0 systemd[1]: Starting libvirt proxy daemon admin socket...
Jan 21 18:02:00 compute-0 systemd[1]: Starting libvirt proxy daemon read-only socket...
Jan 21 18:02:00 compute-0 systemd[1]: Listening on libvirt proxy daemon admin socket.
Jan 21 18:02:00 compute-0 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Jan 21 18:02:00 compute-0 systemd[1]: Starting libvirt proxy daemon...
Jan 21 18:02:00 compute-0 systemd[1]: Started libvirt proxy daemon.
Jan 21 18:02:00 compute-0 sudo[154329]: pam_unix(sudo:session): session closed for user root
Jan 21 18:02:00 compute-0 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Jan 21 18:02:00 compute-0 sudo[154541]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jrtixqbbthisqzweogpktganhqjncnpn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018520.6543593-2116-197461992256830/AnsiballZ_systemd.py'
Jan 21 18:02:00 compute-0 sudo[154541]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:02:01 compute-0 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Jan 21 18:02:01 compute-0 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Jan 21 18:02:01 compute-0 python3.9[154543]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 21 18:02:01 compute-0 systemd[1]: Reloading.
Jan 21 18:02:01 compute-0 systemd-rc-local-generator[154578]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:02:01 compute-0 systemd-sysv-generator[154581]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:02:01 compute-0 systemd[1]: Listening on libvirt locking daemon socket.
Jan 21 18:02:01 compute-0 systemd[1]: Starting libvirt QEMU daemon socket...
Jan 21 18:02:01 compute-0 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Jan 21 18:02:01 compute-0 systemd[1]: Starting Virtual Machine and Container Registration Service...
Jan 21 18:02:01 compute-0 systemd[1]: Listening on libvirt QEMU daemon socket.
Jan 21 18:02:01 compute-0 systemd[1]: Starting libvirt QEMU daemon admin socket...
Jan 21 18:02:01 compute-0 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Jan 21 18:02:01 compute-0 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Jan 21 18:02:01 compute-0 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Jan 21 18:02:01 compute-0 systemd[1]: Started Virtual Machine and Container Registration Service.
Jan 21 18:02:01 compute-0 systemd[1]: Starting libvirt QEMU daemon...
Jan 21 18:02:01 compute-0 systemd[1]: Started libvirt QEMU daemon.
Jan 21 18:02:01 compute-0 sudo[154541]: pam_unix(sudo:session): session closed for user root
Jan 21 18:02:01 compute-0 setroubleshoot[154368]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 5899f043-3496-40d1-aaa4-de14291f5082
Jan 21 18:02:01 compute-0 setroubleshoot[154368]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                  
                                                  *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                  
                                                  If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                  Then turn on full auditing to get path information about the offending file and generate the error again.
                                                  Do
                                                  
                                                  Turn on full auditing
                                                  # auditctl -w /etc/shadow -p w
                                                  Try to recreate AVC. Then execute
                                                  # ausearch -m avc -ts recent
                                                  If you see PATH record check ownership/permissions on file, and fix it,
                                                  otherwise report as a bugzilla.
                                                  
                                                  *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                  
                                                  If you believe that virtlogd should have the dac_read_search capability by default.
                                                  Then you should report this as a bug.
                                                  You can generate a local policy module to allow this access.
                                                  Do
                                                  allow this access for now by executing:
                                                  # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                  # semodule -X 300 -i my-virtlogd.pp
                                                  
Jan 21 18:02:02 compute-0 setroubleshoot[154368]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 5899f043-3496-40d1-aaa4-de14291f5082
Jan 21 18:02:02 compute-0 rsyslogd[1002]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 21 18:02:02 compute-0 rsyslogd[1002]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 21 18:02:02 compute-0 sudo[154766]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kkojdckuuelhacrtbyerrkeveochfuwe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018521.749616-2116-126379165092980/AnsiballZ_systemd.py'
Jan 21 18:02:02 compute-0 sudo[154766]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:02:02 compute-0 setroubleshoot[154368]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                  
                                                  *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                  
                                                  If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                  Then turn on full auditing to get path information about the offending file and generate the error again.
                                                  Do
                                                  
                                                  Turn on full auditing
                                                  # auditctl -w /etc/shadow -p w
                                                  Try to recreate AVC. Then execute
                                                  # ausearch -m avc -ts recent
                                                  If you see PATH record check ownership/permissions on file, and fix it,
                                                  otherwise report as a bugzilla.
                                                  
                                                  *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                  
                                                  If you believe that virtlogd should have the dac_read_search capability by default.
                                                  Then you should report this as a bug.
                                                  You can generate a local policy module to allow this access.
                                                  Do
                                                  allow this access for now by executing:
                                                  # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                  # semodule -X 300 -i my-virtlogd.pp
                                                  
Jan 21 18:02:02 compute-0 python3.9[154769]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 21 18:02:02 compute-0 systemd[1]: Reloading.
Jan 21 18:02:02 compute-0 systemd-sysv-generator[154799]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:02:02 compute-0 systemd-rc-local-generator[154795]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:02:02 compute-0 systemd[1]: Starting libvirt secret daemon socket...
Jan 21 18:02:02 compute-0 systemd[1]: Listening on libvirt secret daemon socket.
Jan 21 18:02:02 compute-0 systemd[1]: Starting libvirt secret daemon admin socket...
Jan 21 18:02:02 compute-0 systemd[1]: Starting libvirt secret daemon read-only socket...
Jan 21 18:02:02 compute-0 systemd[1]: Listening on libvirt secret daemon admin socket.
Jan 21 18:02:02 compute-0 systemd[1]: Listening on libvirt secret daemon read-only socket.
Jan 21 18:02:02 compute-0 systemd[1]: Starting libvirt secret daemon...
Jan 21 18:02:02 compute-0 systemd[1]: Started libvirt secret daemon.
Jan 21 18:02:02 compute-0 sudo[154766]: pam_unix(sudo:session): session closed for user root
Jan 21 18:02:04 compute-0 sudo[154978]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ftbkrxtlgpfgevegwvalzjttgnqbsode ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018524.5573285-2190-81506032481178/AnsiballZ_file.py'
Jan 21 18:02:04 compute-0 sudo[154978]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:02:05 compute-0 python3.9[154980]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:02:05 compute-0 sudo[154978]: pam_unix(sudo:session): session closed for user root
Jan 21 18:02:05 compute-0 sudo[155130]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lnieexviytvyabxcluafmjeywptatzux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018525.281991-2206-175477096970538/AnsiballZ_find.py'
Jan 21 18:02:05 compute-0 sudo[155130]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:02:05 compute-0 python3.9[155132]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 21 18:02:05 compute-0 sudo[155130]: pam_unix(sudo:session): session closed for user root
Jan 21 18:02:06 compute-0 sudo[155282]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tstrpvevnpwtuxjogkyxxdlwvaaxvktx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018526.2703934-2234-264814080428559/AnsiballZ_stat.py'
Jan 21 18:02:06 compute-0 sudo[155282]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:02:06 compute-0 python3.9[155284]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:02:06 compute-0 sudo[155282]: pam_unix(sudo:session): session closed for user root
Jan 21 18:02:07 compute-0 sudo[155405]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vhbecuufzttqnvkbhkhwmqjwlmfxxlfx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018526.2703934-2234-264814080428559/AnsiballZ_copy.py'
Jan 21 18:02:07 compute-0 sudo[155405]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:02:07 compute-0 python3.9[155407]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1769018526.2703934-2234-264814080428559/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:02:07 compute-0 sudo[155405]: pam_unix(sudo:session): session closed for user root
Jan 21 18:02:08 compute-0 sudo[155557]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eufqnqksmiqvfsidwhdhxgmuhzgidaga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018527.73375-2266-13381686964135/AnsiballZ_file.py'
Jan 21 18:02:08 compute-0 sudo[155557]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:02:08 compute-0 python3.9[155559]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:02:08 compute-0 sudo[155557]: pam_unix(sudo:session): session closed for user root
Jan 21 18:02:08 compute-0 sudo[155709]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rihjlltntgwijgynbzzpavsprylbznwi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018528.438123-2282-8394532796747/AnsiballZ_stat.py'
Jan 21 18:02:08 compute-0 sudo[155709]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:02:08 compute-0 python3.9[155711]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:02:09 compute-0 sudo[155709]: pam_unix(sudo:session): session closed for user root
Jan 21 18:02:09 compute-0 sudo[155787]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dxzakpjtjxvmvyhovzpmpvipehcwgsnk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018528.438123-2282-8394532796747/AnsiballZ_file.py'
Jan 21 18:02:09 compute-0 sudo[155787]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:02:09 compute-0 python3.9[155789]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:02:09 compute-0 sudo[155787]: pam_unix(sudo:session): session closed for user root
Jan 21 18:02:09 compute-0 sudo[155939]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bihpjgjqckydvwatyrnouwpaglemafmn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018529.5805604-2306-79390272110793/AnsiballZ_stat.py'
Jan 21 18:02:09 compute-0 sudo[155939]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:02:10 compute-0 python3.9[155941]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:02:10 compute-0 sudo[155939]: pam_unix(sudo:session): session closed for user root
Jan 21 18:02:10 compute-0 sudo[156017]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dluywnkcchqvqfghlamlfinelvjgvcyn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018529.5805604-2306-79390272110793/AnsiballZ_file.py'
Jan 21 18:02:10 compute-0 sudo[156017]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:02:10 compute-0 python3.9[156019]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.n5smf4f3 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:02:10 compute-0 sudo[156017]: pam_unix(sudo:session): session closed for user root
Jan 21 18:02:10 compute-0 sudo[156169]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iydvbvfnauupswihfpzpwaxiegmzmwxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018530.7152238-2330-22796854284753/AnsiballZ_stat.py'
Jan 21 18:02:10 compute-0 sudo[156169]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:02:11 compute-0 python3.9[156171]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:02:11 compute-0 sudo[156169]: pam_unix(sudo:session): session closed for user root
Jan 21 18:02:11 compute-0 sudo[156247]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tcfbmqtjlhlyxohnnwjhnatkxpoamumx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018530.7152238-2330-22796854284753/AnsiballZ_file.py'
Jan 21 18:02:11 compute-0 sudo[156247]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:02:11 compute-0 python3.9[156249]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:02:11 compute-0 sudo[156247]: pam_unix(sudo:session): session closed for user root
Jan 21 18:02:12 compute-0 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Jan 21 18:02:12 compute-0 systemd[1]: setroubleshootd.service: Deactivated successfully.
Jan 21 18:02:12 compute-0 sudo[156399]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qqboqpogkydhlsrfxhxvdvzlcjgjbarj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018531.8140805-2356-178569142431873/AnsiballZ_command.py'
Jan 21 18:02:12 compute-0 sudo[156399]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:02:12 compute-0 python3.9[156401]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:02:12 compute-0 sudo[156399]: pam_unix(sudo:session): session closed for user root
Jan 21 18:02:13 compute-0 sudo[156552]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pkfmoxwuondtdinagqlbiumrdwwobzyj ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769018532.6237278-2372-43333600855747/AnsiballZ_edpm_nftables_from_files.py'
Jan 21 18:02:13 compute-0 sudo[156552]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:02:13 compute-0 python3[156554]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 21 18:02:13 compute-0 sudo[156552]: pam_unix(sudo:session): session closed for user root
Jan 21 18:02:13 compute-0 sudo[156704]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kfolhcsjguzwtsqvhkqlbaissayanjnj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018533.4294748-2388-5584635290116/AnsiballZ_stat.py'
Jan 21 18:02:13 compute-0 sudo[156704]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:02:13 compute-0 python3.9[156706]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:02:13 compute-0 sudo[156704]: pam_unix(sudo:session): session closed for user root
Jan 21 18:02:14 compute-0 sudo[156782]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aoqathxvdsytxsjoqplkvdfquoheddvp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018533.4294748-2388-5584635290116/AnsiballZ_file.py'
Jan 21 18:02:14 compute-0 sudo[156782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:02:14 compute-0 python3.9[156784]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:02:14 compute-0 sudo[156782]: pam_unix(sudo:session): session closed for user root
Jan 21 18:02:15 compute-0 sudo[156934]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bpseeqightomtbugqbokcqmpdgrdeddp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018534.9782639-2412-220341177583773/AnsiballZ_stat.py'
Jan 21 18:02:15 compute-0 sudo[156934]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:02:15 compute-0 python3.9[156936]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:02:15 compute-0 sudo[156934]: pam_unix(sudo:session): session closed for user root
Jan 21 18:02:15 compute-0 sudo[157059]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efnukoivkozeiienqgxewwooievypcsp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018534.9782639-2412-220341177583773/AnsiballZ_copy.py'
Jan 21 18:02:15 compute-0 sudo[157059]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:02:15 compute-0 python3.9[157061]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769018534.9782639-2412-220341177583773/.source.nft follow=False _original_basename=jump-chain.j2 checksum=3ce353c89bce3b135a0ed688d4e338b2efb15185 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:02:16 compute-0 sudo[157059]: pam_unix(sudo:session): session closed for user root
Jan 21 18:02:16 compute-0 sudo[157211]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-shfeqvttwkxwnjuvfsjakadtyeadkikl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018536.2880316-2442-238466108920612/AnsiballZ_stat.py'
Jan 21 18:02:16 compute-0 sudo[157211]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:02:16 compute-0 python3.9[157213]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:02:16 compute-0 sudo[157211]: pam_unix(sudo:session): session closed for user root
Jan 21 18:02:17 compute-0 sudo[157289]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qiqytwuvtmmkbmiyqxqwvfzefqcutjka ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018536.2880316-2442-238466108920612/AnsiballZ_file.py'
Jan 21 18:02:17 compute-0 sudo[157289]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:02:17 compute-0 python3.9[157291]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:02:17 compute-0 sudo[157289]: pam_unix(sudo:session): session closed for user root
Jan 21 18:02:17 compute-0 sudo[157441]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vtqathekbrzlgcvccfljhuvxtsfihzrp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018537.4747372-2466-112848205644387/AnsiballZ_stat.py'
Jan 21 18:02:17 compute-0 sudo[157441]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:02:18 compute-0 python3.9[157443]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:02:18 compute-0 sudo[157441]: pam_unix(sudo:session): session closed for user root
Jan 21 18:02:18 compute-0 sudo[157519]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jkcachvmkpmpmaxupbgrkgkscxeptbzq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018537.4747372-2466-112848205644387/AnsiballZ_file.py'
Jan 21 18:02:18 compute-0 sudo[157519]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:02:18 compute-0 python3.9[157521]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:02:18 compute-0 sudo[157519]: pam_unix(sudo:session): session closed for user root
Jan 21 18:02:19 compute-0 sudo[157671]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-znmetsdmouzpdqgrlrciblowkwffgrru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018538.8505497-2490-20005351833263/AnsiballZ_stat.py'
Jan 21 18:02:19 compute-0 sudo[157671]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:02:19 compute-0 python3.9[157673]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:02:19 compute-0 sudo[157671]: pam_unix(sudo:session): session closed for user root
Jan 21 18:02:19 compute-0 sudo[157796]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iolfaygtmpxpdxyrwtvgmvvpxniykkvm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018538.8505497-2490-20005351833263/AnsiballZ_copy.py'
Jan 21 18:02:19 compute-0 sudo[157796]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:02:20 compute-0 python3.9[157798]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769018538.8505497-2490-20005351833263/.source.nft follow=False _original_basename=ruleset.j2 checksum=8a12d4eb5149b6e500230381c1359a710881e9b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:02:20 compute-0 sudo[157796]: pam_unix(sudo:session): session closed for user root
Jan 21 18:02:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:02:20.056 104698 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:02:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:02:20.059 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:02:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:02:20.059 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:02:20 compute-0 sudo[157948]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mmgdbwcijphxyarfmwkzdpviyuuyhrus ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018540.1877868-2520-248223673728555/AnsiballZ_file.py'
Jan 21 18:02:20 compute-0 sudo[157948]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:02:20 compute-0 python3.9[157950]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:02:20 compute-0 sudo[157948]: pam_unix(sudo:session): session closed for user root
Jan 21 18:02:21 compute-0 sudo[158100]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zleviyvghrsoyzwxeqjsngtqzlutmkhg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018540.869674-2536-59737911590616/AnsiballZ_command.py'
Jan 21 18:02:21 compute-0 sudo[158100]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:02:21 compute-0 python3.9[158102]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:02:21 compute-0 sudo[158100]: pam_unix(sudo:session): session closed for user root
Jan 21 18:02:21 compute-0 sudo[158255]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jhbfhsfcsxsgrultsduxplpdmzjljaoe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018541.527436-2552-126567658102698/AnsiballZ_blockinfile.py'
Jan 21 18:02:21 compute-0 sudo[158255]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:02:22 compute-0 python3.9[158257]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:02:22 compute-0 sudo[158255]: pam_unix(sudo:session): session closed for user root
Jan 21 18:02:22 compute-0 sudo[158407]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ryhczuphcvfyywsgeqdedwbhqbqczlwc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018542.48269-2570-58023493667381/AnsiballZ_command.py'
Jan 21 18:02:22 compute-0 sudo[158407]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:02:22 compute-0 python3.9[158409]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:02:23 compute-0 sudo[158407]: pam_unix(sudo:session): session closed for user root
Jan 21 18:02:23 compute-0 sudo[158560]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eljnaildvtengrpljydxxeipxhreejiu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018543.2345352-2586-46214043079726/AnsiballZ_stat.py'
Jan 21 18:02:23 compute-0 sudo[158560]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:02:23 compute-0 python3.9[158562]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 18:02:23 compute-0 sudo[158560]: pam_unix(sudo:session): session closed for user root
Jan 21 18:02:24 compute-0 podman[158589]: 2026-01-21 18:02:24.008579616 +0000 UTC m=+0.057706250 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 21 18:02:24 compute-0 podman[158590]: 2026-01-21 18:02:24.061916901 +0000 UTC m=+0.110521642 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 21 18:02:24 compute-0 sudo[158759]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vsgmbmidggknmklpocmiyjyhqbxqjyzr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018543.9822705-2602-188474419418541/AnsiballZ_command.py'
Jan 21 18:02:24 compute-0 sudo[158759]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:02:24 compute-0 python3.9[158761]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:02:24 compute-0 sudo[158759]: pam_unix(sudo:session): session closed for user root
Jan 21 18:02:24 compute-0 sudo[158914]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tjursxiptjhizinmnogayllwptwcscuf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018544.718891-2618-30053774670120/AnsiballZ_file.py'
Jan 21 18:02:24 compute-0 sudo[158914]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:02:25 compute-0 python3.9[158916]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:02:25 compute-0 sudo[158914]: pam_unix(sudo:session): session closed for user root
Jan 21 18:02:25 compute-0 sudo[159066]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zbozfzpzgbxxsntfvedxgvzumgvhxpaj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018545.4190195-2634-199767574317209/AnsiballZ_stat.py'
Jan 21 18:02:25 compute-0 sudo[159066]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:02:25 compute-0 python3.9[159068]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:02:25 compute-0 sudo[159066]: pam_unix(sudo:session): session closed for user root
Jan 21 18:02:26 compute-0 sudo[159189]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rvnndbwmvjqjubmslkvvinwvcxmtadok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018545.4190195-2634-199767574317209/AnsiballZ_copy.py'
Jan 21 18:02:26 compute-0 sudo[159189]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:02:26 compute-0 python3.9[159191]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769018545.4190195-2634-199767574317209/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:02:26 compute-0 sudo[159189]: pam_unix(sudo:session): session closed for user root
Jan 21 18:02:26 compute-0 sudo[159341]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-azqwcbgrscsyklqcwfzuiejkubefxepr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018546.633261-2664-156273350124487/AnsiballZ_stat.py'
Jan 21 18:02:26 compute-0 sudo[159341]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:02:27 compute-0 python3.9[159343]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:02:27 compute-0 sudo[159341]: pam_unix(sudo:session): session closed for user root
Jan 21 18:02:27 compute-0 sudo[159464]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cltxoyoqacfrzqvrljkiymldoqvcsajl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018546.633261-2664-156273350124487/AnsiballZ_copy.py'
Jan 21 18:02:27 compute-0 sudo[159464]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:02:27 compute-0 python3.9[159466]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769018546.633261-2664-156273350124487/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:02:27 compute-0 sudo[159464]: pam_unix(sudo:session): session closed for user root
Jan 21 18:02:28 compute-0 sudo[159616]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xeidnykwwtxpwameoiagblwmvqrmiopt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018547.872575-2694-232799339496339/AnsiballZ_stat.py'
Jan 21 18:02:28 compute-0 sudo[159616]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:02:28 compute-0 python3.9[159618]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:02:28 compute-0 sudo[159616]: pam_unix(sudo:session): session closed for user root
Jan 21 18:02:28 compute-0 sudo[159739]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uqqskpgvbbubrsedwfirtaoyumiogzcl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018547.872575-2694-232799339496339/AnsiballZ_copy.py'
Jan 21 18:02:28 compute-0 sudo[159739]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:02:28 compute-0 python3.9[159741]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769018547.872575-2694-232799339496339/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:02:28 compute-0 sudo[159739]: pam_unix(sudo:session): session closed for user root
Jan 21 18:02:29 compute-0 sudo[159891]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xrhmpdjxcfbyjzlvbbmkuiomjfyqnxuk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018549.1439884-2724-53012193267134/AnsiballZ_systemd.py'
Jan 21 18:02:29 compute-0 sudo[159891]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:02:29 compute-0 python3.9[159893]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 18:02:29 compute-0 systemd[1]: Reloading.
Jan 21 18:02:29 compute-0 systemd-rc-local-generator[159922]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:02:29 compute-0 systemd-sysv-generator[159926]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:02:30 compute-0 systemd[1]: Reached target edpm_libvirt.target.
Jan 21 18:02:30 compute-0 sudo[159891]: pam_unix(sudo:session): session closed for user root
Jan 21 18:02:30 compute-0 sudo[160083]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ceilshjrjxcztnizavttktavffudkabs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018550.5561945-2740-79984416842415/AnsiballZ_systemd.py'
Jan 21 18:02:30 compute-0 sudo[160083]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:02:31 compute-0 python3.9[160085]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Jan 21 18:02:31 compute-0 systemd[1]: Reloading.
Jan 21 18:02:31 compute-0 systemd-rc-local-generator[160107]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:02:31 compute-0 systemd-sysv-generator[160113]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:02:31 compute-0 systemd[1]: Reloading.
Jan 21 18:02:31 compute-0 systemd-rc-local-generator[160151]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:02:31 compute-0 systemd-sysv-generator[160154]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:02:31 compute-0 sudo[160083]: pam_unix(sudo:session): session closed for user root
Jan 21 18:02:32 compute-0 sshd-session[138369]: Connection closed by 192.168.122.30 port 45328
Jan 21 18:02:32 compute-0 sshd-session[138366]: pam_unix(sshd:session): session closed for user zuul
Jan 21 18:02:32 compute-0 systemd[1]: session-24.scope: Deactivated successfully.
Jan 21 18:02:32 compute-0 systemd[1]: session-24.scope: Consumed 1min 19.787s CPU time.
Jan 21 18:02:32 compute-0 systemd-logind[782]: Session 24 logged out. Waiting for processes to exit.
Jan 21 18:02:32 compute-0 systemd-logind[782]: Removed session 24.
Jan 21 18:02:38 compute-0 sshd-session[160183]: Accepted publickey for zuul from 192.168.122.30 port 53706 ssh2: ECDSA SHA256:cKen23vhgWniT1Xii+Y+iYDmAKCwydOnjbSSWnKmVRE
Jan 21 18:02:38 compute-0 systemd-logind[782]: New session 25 of user zuul.
Jan 21 18:02:38 compute-0 systemd[1]: Started Session 25 of User zuul.
Jan 21 18:02:38 compute-0 sshd-session[160183]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 21 18:02:39 compute-0 sshd-session[160187]: Invalid user polkadot from 64.227.98.100 port 46516
Jan 21 18:02:39 compute-0 sshd-session[160187]: Connection closed by invalid user polkadot 64.227.98.100 port 46516 [preauth]
Jan 21 18:02:39 compute-0 python3.9[160338]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 18:02:41 compute-0 python3.9[160492]: ansible-ansible.builtin.service_facts Invoked
Jan 21 18:02:41 compute-0 network[160509]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 21 18:02:41 compute-0 network[160510]: 'network-scripts' will be removed from distribution in near future.
Jan 21 18:02:41 compute-0 network[160511]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 21 18:02:44 compute-0 sudo[160780]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bnzsgxynstjrnffqbyxgyxbofwqdjojj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018564.6962655-69-258307246636601/AnsiballZ_setup.py'
Jan 21 18:02:44 compute-0 sudo[160780]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:02:45 compute-0 python3.9[160782]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 21 18:02:45 compute-0 sudo[160780]: pam_unix(sudo:session): session closed for user root
Jan 21 18:02:46 compute-0 sudo[160864]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hyzrfbtyotsnemnwzjansuuebaouzwxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018564.6962655-69-258307246636601/AnsiballZ_dnf.py'
Jan 21 18:02:46 compute-0 sudo[160864]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:02:46 compute-0 python3.9[160866]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 21 18:02:51 compute-0 sudo[160864]: pam_unix(sudo:session): session closed for user root
Jan 21 18:02:52 compute-0 sudo[161017]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ofehjpqzwvjzwnhszljusmplpjtdlvqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018571.9860227-93-229075506308029/AnsiballZ_stat.py'
Jan 21 18:02:52 compute-0 sudo[161017]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:02:52 compute-0 python3.9[161019]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 18:02:52 compute-0 sudo[161017]: pam_unix(sudo:session): session closed for user root
Jan 21 18:02:53 compute-0 sudo[161169]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aoqjqnwfibymadhwagkfquarcspnkdmi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018572.9356327-113-199844394138781/AnsiballZ_command.py'
Jan 21 18:02:53 compute-0 sudo[161169]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:02:53 compute-0 python3.9[161171]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:02:53 compute-0 sudo[161169]: pam_unix(sudo:session): session closed for user root
Jan 21 18:02:54 compute-0 sudo[161343]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aaqjktxqxaqunrbqfgrcqvxxfwtoaiyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018573.9395366-133-158979882990374/AnsiballZ_stat.py'
Jan 21 18:02:54 compute-0 sudo[161343]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:02:54 compute-0 podman[161297]: 2026-01-21 18:02:54.276651553 +0000 UTC m=+0.066317742 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 21 18:02:54 compute-0 podman[161296]: 2026-01-21 18:02:54.298696616 +0000 UTC m=+0.089260377 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, 
tcib_managed=true, container_name=ovn_controller)
Jan 21 18:02:54 compute-0 python3.9[161361]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 18:02:54 compute-0 sudo[161343]: pam_unix(sudo:session): session closed for user root
Jan 21 18:02:54 compute-0 sudo[161519]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-purnstktgnswuzmhsnjzjdeytoacdpcm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018574.666382-149-81606873529170/AnsiballZ_command.py'
Jan 21 18:02:54 compute-0 sudo[161519]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:02:55 compute-0 python3.9[161521]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:02:55 compute-0 sudo[161519]: pam_unix(sudo:session): session closed for user root
Jan 21 18:02:55 compute-0 sudo[161672]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cxitzouyscmbmgkpazgqjarbspdkpifz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018575.375514-165-239547504791671/AnsiballZ_stat.py'
Jan 21 18:02:55 compute-0 sudo[161672]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:02:55 compute-0 python3.9[161674]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:02:55 compute-0 sudo[161672]: pam_unix(sudo:session): session closed for user root
Jan 21 18:02:56 compute-0 sudo[161795]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hznoxggcmfmzezwvhpgeahlmsszriacr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018575.375514-165-239547504791671/AnsiballZ_copy.py'
Jan 21 18:02:56 compute-0 sudo[161795]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:02:56 compute-0 python3.9[161797]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769018575.375514-165-239547504791671/.source.iscsi _original_basename=.mj0ozhwh follow=False checksum=f5ac357d3628fc6f9c0050a256796a482f621fec backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:02:56 compute-0 sudo[161795]: pam_unix(sudo:session): session closed for user root
Jan 21 18:02:57 compute-0 sudo[161947]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jjtnofwsetgjxnuxklocswiihfifwlvu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018576.715945-195-72294021280093/AnsiballZ_file.py'
Jan 21 18:02:57 compute-0 sudo[161947]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:02:57 compute-0 python3.9[161949]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:02:57 compute-0 sudo[161947]: pam_unix(sudo:session): session closed for user root
Jan 21 18:02:58 compute-0 sudo[162099]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mcutqepamtxcyjabjuiifeynfkxedsrk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018577.6108747-211-273370234644859/AnsiballZ_lineinfile.py'
Jan 21 18:02:58 compute-0 sudo[162099]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:02:58 compute-0 python3.9[162101]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:02:58 compute-0 sudo[162099]: pam_unix(sudo:session): session closed for user root
Jan 21 18:02:59 compute-0 sudo[162251]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lhodzeiokkytdwajskvajoarokqtepjn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018578.4846418-229-244584178678008/AnsiballZ_systemd_service.py'
Jan 21 18:02:59 compute-0 sudo[162251]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:02:59 compute-0 python3.9[162253]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 18:02:59 compute-0 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Jan 21 18:02:59 compute-0 sudo[162251]: pam_unix(sudo:session): session closed for user root
Jan 21 18:02:59 compute-0 sudo[162407]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pkylsphaolsmypnakmehazurqgzqndbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018579.676306-245-191776155671785/AnsiballZ_systemd_service.py'
Jan 21 18:02:59 compute-0 sudo[162407]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:03:00 compute-0 python3.9[162409]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 18:03:00 compute-0 systemd[1]: Reloading.
Jan 21 18:03:00 compute-0 systemd-rc-local-generator[162436]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:03:00 compute-0 systemd-sysv-generator[162440]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:03:00 compute-0 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Jan 21 18:03:00 compute-0 systemd[1]: Starting Open-iSCSI...
Jan 21 18:03:00 compute-0 kernel: Loading iSCSI transport class v2.0-870.
Jan 21 18:03:00 compute-0 systemd[1]: Started Open-iSCSI.
Jan 21 18:03:00 compute-0 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Jan 21 18:03:00 compute-0 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Jan 21 18:03:00 compute-0 sudo[162407]: pam_unix(sudo:session): session closed for user root
Jan 21 18:03:01 compute-0 python3.9[162606]: ansible-ansible.builtin.service_facts Invoked
Jan 21 18:03:01 compute-0 network[162623]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 21 18:03:01 compute-0 network[162624]: 'network-scripts' will be removed from distribution in near future.
Jan 21 18:03:01 compute-0 network[162625]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 21 18:03:05 compute-0 sudo[162894]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-agjkqaquhjfawhljqyjtspfoymbqnrts ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018585.0140054-291-164683496036253/AnsiballZ_dnf.py'
Jan 21 18:03:05 compute-0 sudo[162894]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:03:05 compute-0 python3.9[162896]: ansible-ansible.legacy.dnf Invoked with name=['device-mapper-multipath'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 21 18:03:08 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 21 18:03:08 compute-0 systemd[1]: Starting man-db-cache-update.service...
Jan 21 18:03:08 compute-0 systemd[1]: Reloading.
Jan 21 18:03:08 compute-0 systemd-rc-local-generator[162942]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:03:08 compute-0 systemd-sysv-generator[162948]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:03:08 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 21 18:03:09 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 21 18:03:09 compute-0 systemd[1]: Finished man-db-cache-update.service.
Jan 21 18:03:09 compute-0 systemd[1]: run-reffbbb0552694f978bace2249fa37a22.service: Deactivated successfully.
Jan 21 18:03:09 compute-0 sudo[162894]: pam_unix(sudo:session): session closed for user root
Jan 21 18:03:09 compute-0 sudo[163211]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oqmdtrgcfkvgtztgvwjkflvqucebdnfy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018589.704234-309-132541955347544/AnsiballZ_file.py'
Jan 21 18:03:09 compute-0 sudo[163211]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:03:10 compute-0 python3.9[163213]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Jan 21 18:03:10 compute-0 sudo[163211]: pam_unix(sudo:session): session closed for user root
Jan 21 18:03:10 compute-0 sudo[163363]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efemoschivytzcpudknlgobtxserysgx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018590.3943865-325-121861924468301/AnsiballZ_modprobe.py'
Jan 21 18:03:10 compute-0 sudo[163363]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:03:10 compute-0 python3.9[163365]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Jan 21 18:03:11 compute-0 sudo[163363]: pam_unix(sudo:session): session closed for user root
Jan 21 18:03:11 compute-0 sudo[163519]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sfhouensvjbxwdzouqvlzxqcqcrusuxh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018591.4430058-341-147187546420738/AnsiballZ_stat.py'
Jan 21 18:03:11 compute-0 sudo[163519]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:03:11 compute-0 python3.9[163521]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:03:11 compute-0 sudo[163519]: pam_unix(sudo:session): session closed for user root
Jan 21 18:03:12 compute-0 sudo[163642]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-asppjeaglkeuvhdrnullfdcrivpyrjza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018591.4430058-341-147187546420738/AnsiballZ_copy.py'
Jan 21 18:03:12 compute-0 sudo[163642]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:03:12 compute-0 python3.9[163644]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769018591.4430058-341-147187546420738/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:03:12 compute-0 sudo[163642]: pam_unix(sudo:session): session closed for user root
Jan 21 18:03:13 compute-0 sudo[163794]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gvgyodtiajrqkcxgnkulqgfxmqtohruu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018592.8800046-373-185612238637696/AnsiballZ_lineinfile.py'
Jan 21 18:03:13 compute-0 sudo[163794]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:03:13 compute-0 python3.9[163796]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:03:13 compute-0 sudo[163794]: pam_unix(sudo:session): session closed for user root
Jan 21 18:03:14 compute-0 sudo[163946]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kgnncenbhjuhphvahvyznuelakqqkwtl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018593.6048746-389-132170838269897/AnsiballZ_systemd.py'
Jan 21 18:03:14 compute-0 sudo[163946]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:03:14 compute-0 python3.9[163948]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 21 18:03:14 compute-0 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jan 21 18:03:14 compute-0 systemd[1]: Stopped Load Kernel Modules.
Jan 21 18:03:14 compute-0 systemd[1]: Stopping Load Kernel Modules...
Jan 21 18:03:14 compute-0 systemd[1]: Starting Load Kernel Modules...
Jan 21 18:03:14 compute-0 systemd[1]: Finished Load Kernel Modules.
Jan 21 18:03:14 compute-0 sudo[163946]: pam_unix(sudo:session): session closed for user root
Jan 21 18:03:15 compute-0 sudo[164102]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xpodqnpdncsyncvqpasvhoxyxpgiucpe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018594.8198266-405-21683022497389/AnsiballZ_command.py'
Jan 21 18:03:15 compute-0 sudo[164102]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:03:15 compute-0 python3.9[164104]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/multipath _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:03:15 compute-0 sudo[164102]: pam_unix(sudo:session): session closed for user root
Jan 21 18:03:15 compute-0 sudo[164255]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sxeincilrvmqhifvdlkeshflexcycmux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018595.6554487-425-91877204374456/AnsiballZ_stat.py'
Jan 21 18:03:15 compute-0 sudo[164255]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:03:16 compute-0 python3.9[164257]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 18:03:16 compute-0 sudo[164255]: pam_unix(sudo:session): session closed for user root
Jan 21 18:03:16 compute-0 sudo[164407]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mmediaaxlbywasejqqcovcwexvmxmpfa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018596.4550428-443-59491177919414/AnsiballZ_stat.py'
Jan 21 18:03:16 compute-0 sudo[164407]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:03:16 compute-0 python3.9[164409]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:03:16 compute-0 sudo[164407]: pam_unix(sudo:session): session closed for user root
Jan 21 18:03:17 compute-0 sudo[164530]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-waidmnfauqwykbbtrhfrxmfwvbxacbzj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018596.4550428-443-59491177919414/AnsiballZ_copy.py'
Jan 21 18:03:17 compute-0 sudo[164530]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:03:17 compute-0 python3.9[164532]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769018596.4550428-443-59491177919414/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:03:17 compute-0 sudo[164530]: pam_unix(sudo:session): session closed for user root
Jan 21 18:03:18 compute-0 sudo[164682]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jwqdmlojkgeuycavvyuslkmkunrdpapf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018597.813658-473-226225141226919/AnsiballZ_command.py'
Jan 21 18:03:18 compute-0 sudo[164682]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:03:18 compute-0 python3.9[164684]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:03:18 compute-0 sudo[164682]: pam_unix(sudo:session): session closed for user root
Jan 21 18:03:18 compute-0 sudo[164835]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cnwjpsrlicdzbheiemztzhobbxvbtdvz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018598.5490887-489-279506045252889/AnsiballZ_lineinfile.py'
Jan 21 18:03:18 compute-0 sudo[164835]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:03:19 compute-0 python3.9[164837]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:03:19 compute-0 sudo[164835]: pam_unix(sudo:session): session closed for user root
Jan 21 18:03:19 compute-0 sudo[164987]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bmailpzjqwvckwcgrujypedxwzhfwzux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018599.3304732-505-216288303985329/AnsiballZ_replace.py'
Jan 21 18:03:19 compute-0 sudo[164987]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:03:19 compute-0 python3.9[164989]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:03:19 compute-0 sudo[164987]: pam_unix(sudo:session): session closed for user root
Jan 21 18:03:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:03:20.057 104698 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:03:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:03:20.059 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:03:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:03:20.059 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:03:20 compute-0 sudo[165139]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fjyjlxvoztneqcxarvoysryakvtcbahn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018600.15606-521-256332648570247/AnsiballZ_replace.py'
Jan 21 18:03:20 compute-0 sudo[165139]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:03:20 compute-0 python3.9[165141]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:03:20 compute-0 sudo[165139]: pam_unix(sudo:session): session closed for user root
Jan 21 18:03:21 compute-0 sudo[165291]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lxogpjimaalzdennzsybiqbwsipuowbg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018600.808902-539-31571383197852/AnsiballZ_lineinfile.py'
Jan 21 18:03:21 compute-0 sudo[165291]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:03:21 compute-0 python3.9[165293]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:03:21 compute-0 sudo[165291]: pam_unix(sudo:session): session closed for user root
Jan 21 18:03:21 compute-0 sudo[165443]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ftperqztyyprlmuztjdcgzwrgftshvuc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018601.7112823-539-117606864178398/AnsiballZ_lineinfile.py'
Jan 21 18:03:21 compute-0 sudo[165443]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:03:22 compute-0 python3.9[165445]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:03:22 compute-0 sudo[165443]: pam_unix(sudo:session): session closed for user root
Jan 21 18:03:22 compute-0 sudo[165595]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-unljjzyadpahzimbymbxkqxucqmlgbhv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018602.3241658-539-106594895685519/AnsiballZ_lineinfile.py'
Jan 21 18:03:22 compute-0 sudo[165595]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:03:22 compute-0 python3.9[165597]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:03:22 compute-0 sudo[165595]: pam_unix(sudo:session): session closed for user root
Jan 21 18:03:23 compute-0 sudo[165747]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mytxkbwbwbiuqjtyhyzlicmwjfwfhtkn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018602.9307475-539-70085806695120/AnsiballZ_lineinfile.py'
Jan 21 18:03:23 compute-0 sudo[165747]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:03:23 compute-0 python3.9[165749]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:03:23 compute-0 sudo[165747]: pam_unix(sudo:session): session closed for user root
Jan 21 18:03:23 compute-0 sudo[165899]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tngxibmanlmwlsspqafxdpbacmearzsr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018603.6070936-597-193895857224691/AnsiballZ_stat.py'
Jan 21 18:03:23 compute-0 sudo[165899]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:03:24 compute-0 python3.9[165901]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 18:03:24 compute-0 sudo[165899]: pam_unix(sudo:session): session closed for user root
Jan 21 18:03:24 compute-0 sudo[166075]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hxljgjlrjzhvluikbtsvlhsroggtgpcs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018604.2576435-613-128849536519077/AnsiballZ_command.py'
Jan 21 18:03:24 compute-0 sudo[166075]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:03:24 compute-0 podman[166028]: 2026-01-21 18:03:24.538650674 +0000 UTC m=+0.059313232 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 18:03:24 compute-0 podman[166027]: 2026-01-21 18:03:24.621746374 +0000 UTC m=+0.141807068 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 18:03:24 compute-0 python3.9[166094]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/true _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:03:24 compute-0 sudo[166075]: pam_unix(sudo:session): session closed for user root
Jan 21 18:03:25 compute-0 sudo[166252]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jpircronzfqeqfnccwlonjoqnwayyoay ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018604.9767396-631-277033792607966/AnsiballZ_systemd_service.py'
Jan 21 18:03:25 compute-0 sudo[166252]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:03:25 compute-0 python3.9[166254]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 18:03:25 compute-0 systemd[1]: Listening on multipathd control socket.
Jan 21 18:03:25 compute-0 sudo[166252]: pam_unix(sudo:session): session closed for user root
Jan 21 18:03:26 compute-0 sudo[166408]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vsidrpwswhxjsyfflaeflnnncvcchhhx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018605.8428211-647-204497461185770/AnsiballZ_systemd_service.py'
Jan 21 18:03:26 compute-0 sudo[166408]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:03:26 compute-0 python3.9[166410]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 18:03:26 compute-0 systemd[1]: Starting Wait for udev To Complete Device Initialization...
Jan 21 18:03:26 compute-0 udevadm[166415]: systemd-udev-settle.service is deprecated. Please fix multipathd.service not to pull it in.
Jan 21 18:03:26 compute-0 systemd[1]: Finished Wait for udev To Complete Device Initialization.
Jan 21 18:03:26 compute-0 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Jan 21 18:03:26 compute-0 multipathd[166418]: --------start up--------
Jan 21 18:03:26 compute-0 multipathd[166418]: read /etc/multipath.conf
Jan 21 18:03:26 compute-0 multipathd[166418]: path checkers start up
Jan 21 18:03:26 compute-0 systemd[1]: Started Device-Mapper Multipath Device Controller.
Jan 21 18:03:26 compute-0 sudo[166408]: pam_unix(sudo:session): session closed for user root
Jan 21 18:03:27 compute-0 sudo[166575]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aaxgxcixzwczptyyljsxxvxheozkygda ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018607.023344-671-151639881641625/AnsiballZ_file.py'
Jan 21 18:03:27 compute-0 sudo[166575]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:03:27 compute-0 python3.9[166577]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Jan 21 18:03:27 compute-0 sudo[166575]: pam_unix(sudo:session): session closed for user root
Jan 21 18:03:27 compute-0 sudo[166727]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lhynjvtksbzykzlhnxkrrkhbkiwjrheq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018607.7204428-687-131418712119337/AnsiballZ_modprobe.py'
Jan 21 18:03:27 compute-0 sudo[166727]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:03:28 compute-0 python3.9[166729]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Jan 21 18:03:28 compute-0 kernel: Key type psk registered
Jan 21 18:03:28 compute-0 sudo[166727]: pam_unix(sudo:session): session closed for user root
Jan 21 18:03:28 compute-0 sudo[166890]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-giggkwkvusqrjdnlhyblepnnrlmsefqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018608.3997016-703-149808866867370/AnsiballZ_stat.py'
Jan 21 18:03:28 compute-0 sudo[166890]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:03:28 compute-0 python3.9[166892]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:03:28 compute-0 sudo[166890]: pam_unix(sudo:session): session closed for user root
Jan 21 18:03:29 compute-0 sudo[167013]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-foxopkplpfsbhsxdmaqpkbamxwishzgl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018608.3997016-703-149808866867370/AnsiballZ_copy.py'
Jan 21 18:03:29 compute-0 sudo[167013]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:03:29 compute-0 python3.9[167015]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769018608.3997016-703-149808866867370/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:03:29 compute-0 sudo[167013]: pam_unix(sudo:session): session closed for user root
Jan 21 18:03:29 compute-0 sudo[167165]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bdijuppfncpnfmzqywatjvkemtauqmfu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018609.686214-735-196374717116901/AnsiballZ_lineinfile.py'
Jan 21 18:03:29 compute-0 sudo[167165]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:03:30 compute-0 python3.9[167167]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:03:30 compute-0 sudo[167165]: pam_unix(sudo:session): session closed for user root
Jan 21 18:03:30 compute-0 sudo[167317]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-srxqwzptbxhvkkxnvtnemynwmqxoormb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018610.3000052-751-95287375391663/AnsiballZ_systemd.py'
Jan 21 18:03:30 compute-0 sudo[167317]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:03:30 compute-0 python3.9[167319]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 21 18:03:30 compute-0 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jan 21 18:03:30 compute-0 systemd[1]: Stopped Load Kernel Modules.
Jan 21 18:03:30 compute-0 systemd[1]: Stopping Load Kernel Modules...
Jan 21 18:03:30 compute-0 systemd[1]: Starting Load Kernel Modules...
Jan 21 18:03:30 compute-0 systemd[1]: Finished Load Kernel Modules.
Jan 21 18:03:30 compute-0 sudo[167317]: pam_unix(sudo:session): session closed for user root
Jan 21 18:03:31 compute-0 sudo[167473]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wncqetqeljntcuxjiaiydomkkgihhfkv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018611.169686-767-87326799368228/AnsiballZ_dnf.py'
Jan 21 18:03:31 compute-0 sudo[167473]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:03:31 compute-0 python3.9[167475]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 21 18:03:33 compute-0 systemd[1]: Reloading.
Jan 21 18:03:33 compute-0 systemd-rc-local-generator[167502]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:03:33 compute-0 systemd-sysv-generator[167506]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:03:34 compute-0 systemd[1]: Reloading.
Jan 21 18:03:34 compute-0 systemd-rc-local-generator[167542]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:03:34 compute-0 systemd-sysv-generator[167546]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:03:34 compute-0 systemd-logind[782]: Watching system buttons on /dev/input/event0 (Power Button)
Jan 21 18:03:34 compute-0 systemd-logind[782]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Jan 21 18:03:34 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 21 18:03:34 compute-0 systemd[1]: Starting man-db-cache-update.service...
Jan 21 18:03:34 compute-0 systemd[1]: Reloading.
Jan 21 18:03:34 compute-0 systemd-rc-local-generator[167636]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:03:34 compute-0 systemd-sysv-generator[167639]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:03:34 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 21 18:03:35 compute-0 sudo[167473]: pam_unix(sudo:session): session closed for user root
Jan 21 18:03:35 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 21 18:03:35 compute-0 systemd[1]: Finished man-db-cache-update.service.
Jan 21 18:03:35 compute-0 systemd[1]: man-db-cache-update.service: Consumed 1.420s CPU time.
Jan 21 18:03:35 compute-0 systemd[1]: run-r2aac4f1d71284aa8892613cec62aa119.service: Deactivated successfully.
Jan 21 18:03:36 compute-0 sudo[168937]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txnewrxmgogfhpriauupimfojbqdmhbr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018615.8582141-783-148412011033959/AnsiballZ_systemd_service.py'
Jan 21 18:03:36 compute-0 sudo[168937]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:03:36 compute-0 python3.9[168939]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 21 18:03:36 compute-0 systemd[1]: Stopping Open-iSCSI...
Jan 21 18:03:36 compute-0 iscsid[162448]: iscsid shutting down.
Jan 21 18:03:36 compute-0 systemd[1]: iscsid.service: Deactivated successfully.
Jan 21 18:03:36 compute-0 systemd[1]: Stopped Open-iSCSI.
Jan 21 18:03:36 compute-0 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Jan 21 18:03:36 compute-0 systemd[1]: Starting Open-iSCSI...
Jan 21 18:03:36 compute-0 systemd[1]: Started Open-iSCSI.
Jan 21 18:03:36 compute-0 sudo[168937]: pam_unix(sudo:session): session closed for user root
Jan 21 18:03:37 compute-0 sudo[169093]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bwcvqxvxzfbdqgwbzdtqlwsjppzbkoyd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018616.7746167-799-35452695027560/AnsiballZ_systemd_service.py'
Jan 21 18:03:37 compute-0 sudo[169093]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:03:37 compute-0 python3.9[169095]: ansible-ansible.builtin.systemd_service Invoked with name=multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 21 18:03:37 compute-0 systemd[1]: Stopping Device-Mapper Multipath Device Controller...
Jan 21 18:03:37 compute-0 multipathd[166418]: exit (signal)
Jan 21 18:03:37 compute-0 multipathd[166418]: --------shut down-------
Jan 21 18:03:37 compute-0 systemd[1]: multipathd.service: Deactivated successfully.
Jan 21 18:03:37 compute-0 systemd[1]: Stopped Device-Mapper Multipath Device Controller.
Jan 21 18:03:37 compute-0 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Jan 21 18:03:37 compute-0 multipathd[169102]: --------start up--------
Jan 21 18:03:37 compute-0 multipathd[169102]: read /etc/multipath.conf
Jan 21 18:03:37 compute-0 multipathd[169102]: path checkers start up
Jan 21 18:03:37 compute-0 systemd[1]: Started Device-Mapper Multipath Device Controller.
Jan 21 18:03:37 compute-0 sudo[169093]: pam_unix(sudo:session): session closed for user root
Jan 21 18:03:38 compute-0 python3.9[169259]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 18:03:38 compute-0 sudo[169413]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-otyacpmadrdlmsoacikyqqsxhhuacsfb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018618.7273192-834-198332944225326/AnsiballZ_file.py'
Jan 21 18:03:38 compute-0 sudo[169413]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:03:39 compute-0 python3.9[169415]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:03:39 compute-0 sudo[169413]: pam_unix(sudo:session): session closed for user root
Jan 21 18:03:40 compute-0 sudo[169565]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fuquuxomrmlzvutytuwkaasiktejhuml ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018619.7649205-856-125072333819135/AnsiballZ_systemd_service.py'
Jan 21 18:03:40 compute-0 sudo[169565]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:03:40 compute-0 python3.9[169567]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 21 18:03:40 compute-0 systemd[1]: Reloading.
Jan 21 18:03:40 compute-0 systemd-rc-local-generator[169595]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:03:40 compute-0 systemd-sysv-generator[169598]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:03:40 compute-0 sudo[169565]: pam_unix(sudo:session): session closed for user root
Jan 21 18:03:41 compute-0 python3.9[169752]: ansible-ansible.builtin.service_facts Invoked
Jan 21 18:03:41 compute-0 network[169769]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 21 18:03:41 compute-0 network[169770]: 'network-scripts' will be removed from distribution in near future.
Jan 21 18:03:41 compute-0 network[169771]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 21 18:03:44 compute-0 sudo[170041]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tyjomncjaveqpckjpmltzmtrxdzelleg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018624.4118989-894-31321708359342/AnsiballZ_systemd_service.py'
Jan 21 18:03:44 compute-0 sudo[170041]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:03:44 compute-0 python3.9[170043]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 18:03:45 compute-0 sudo[170041]: pam_unix(sudo:session): session closed for user root
Jan 21 18:03:45 compute-0 sudo[170194]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tcpkmdqrqujcsawshcittouuctkhtkge ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018625.1352687-894-108071506789095/AnsiballZ_systemd_service.py'
Jan 21 18:03:45 compute-0 sudo[170194]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:03:45 compute-0 python3.9[170196]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 18:03:45 compute-0 sudo[170194]: pam_unix(sudo:session): session closed for user root
Jan 21 18:03:46 compute-0 sudo[170347]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tipdhflutalcjdgwwuiijsvognikwhok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018625.8688262-894-26418026826884/AnsiballZ_systemd_service.py'
Jan 21 18:03:46 compute-0 sudo[170347]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:03:46 compute-0 python3.9[170349]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 18:03:46 compute-0 sudo[170347]: pam_unix(sudo:session): session closed for user root
Jan 21 18:03:46 compute-0 sudo[170500]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-whgrzgwlfujzkiofgskywetedvqandgb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018626.6932273-894-104162441832113/AnsiballZ_systemd_service.py'
Jan 21 18:03:46 compute-0 sudo[170500]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:03:47 compute-0 python3.9[170502]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 18:03:47 compute-0 sudo[170500]: pam_unix(sudo:session): session closed for user root
Jan 21 18:03:47 compute-0 sudo[170653]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-axgnogqizitubqvndarqxflfivixpwtx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018627.4131434-894-145318301445814/AnsiballZ_systemd_service.py'
Jan 21 18:03:47 compute-0 sudo[170653]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:03:47 compute-0 python3.9[170655]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 18:03:47 compute-0 sudo[170653]: pam_unix(sudo:session): session closed for user root
Jan 21 18:03:48 compute-0 sudo[170806]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pnbdjhukwvdyfkozeubkzzalacycjwtb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018628.094026-894-227309137451365/AnsiballZ_systemd_service.py'
Jan 21 18:03:48 compute-0 sudo[170806]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:03:48 compute-0 python3.9[170808]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 18:03:48 compute-0 sudo[170806]: pam_unix(sudo:session): session closed for user root
Jan 21 18:03:49 compute-0 sudo[170959]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uxanagbxaftgviazcvkshajkjhkyngph ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018628.7954664-894-147859151429468/AnsiballZ_systemd_service.py'
Jan 21 18:03:49 compute-0 sudo[170959]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:03:49 compute-0 python3.9[170961]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 18:03:50 compute-0 sudo[170959]: pam_unix(sudo:session): session closed for user root
Jan 21 18:03:50 compute-0 sudo[171112]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hbhrvvrtkbbajcgzfgseergvaghrrquc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018630.6222007-894-164547114363222/AnsiballZ_systemd_service.py'
Jan 21 18:03:50 compute-0 sudo[171112]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:03:51 compute-0 python3.9[171114]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 18:03:51 compute-0 sudo[171112]: pam_unix(sudo:session): session closed for user root
Jan 21 18:03:51 compute-0 sudo[171265]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wvzlptpjabqyekmkrsqbatdukezwjzmp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018631.6267009-1012-176826631730129/AnsiballZ_file.py'
Jan 21 18:03:51 compute-0 sudo[171265]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:03:52 compute-0 python3.9[171267]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:03:52 compute-0 sudo[171265]: pam_unix(sudo:session): session closed for user root
Jan 21 18:03:52 compute-0 sudo[171417]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xcljhkdywvelufzpwuodjkflmbziltge ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018632.3009033-1012-170016125445898/AnsiballZ_file.py'
Jan 21 18:03:52 compute-0 sudo[171417]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:03:52 compute-0 python3.9[171419]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:03:52 compute-0 sudo[171417]: pam_unix(sudo:session): session closed for user root
Jan 21 18:03:53 compute-0 sudo[171569]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bvcuzrvqamqquzumumcdufssmsjlvzxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018632.8683364-1012-270023779900334/AnsiballZ_file.py'
Jan 21 18:03:53 compute-0 sudo[171569]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:03:53 compute-0 python3.9[171571]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:03:53 compute-0 sudo[171569]: pam_unix(sudo:session): session closed for user root
Jan 21 18:03:53 compute-0 sudo[171721]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-flnjxmgbkrktchwjcvyqmaldabaiybzz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018633.4712715-1012-171422579047372/AnsiballZ_file.py'
Jan 21 18:03:53 compute-0 sudo[171721]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:03:53 compute-0 python3.9[171723]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:03:53 compute-0 sudo[171721]: pam_unix(sudo:session): session closed for user root
Jan 21 18:03:54 compute-0 sudo[171873]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bvigshruhuasuveuhvgudpzqzjuzficj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018634.039275-1012-239485629211503/AnsiballZ_file.py'
Jan 21 18:03:54 compute-0 sudo[171873]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:03:54 compute-0 python3.9[171875]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:03:54 compute-0 sudo[171873]: pam_unix(sudo:session): session closed for user root
Jan 21 18:03:54 compute-0 sudo[172048]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rybcixcncflwoyeozrnlmvwkwpqflcwa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018634.598283-1012-194716681451791/AnsiballZ_file.py'
Jan 21 18:03:54 compute-0 sudo[172048]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:03:54 compute-0 podman[172000]: 2026-01-21 18:03:54.88437711 +0000 UTC m=+0.059088276 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 18:03:54 compute-0 podman[171999]: 2026-01-21 18:03:54.937072653 +0000 UTC m=+0.112246940 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_controller, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 21 18:03:55 compute-0 python3.9[172060]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:03:55 compute-0 sudo[172048]: pam_unix(sudo:session): session closed for user root
Jan 21 18:03:55 compute-0 sudo[172219]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nawumlxmrmfbiatgdifxtphmepufiaqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018635.1922674-1012-105998045327733/AnsiballZ_file.py'
Jan 21 18:03:55 compute-0 sudo[172219]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:03:55 compute-0 python3.9[172221]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:03:55 compute-0 sudo[172219]: pam_unix(sudo:session): session closed for user root
Jan 21 18:03:56 compute-0 sudo[172371]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xeodcfbghkbaaskeckehjjiapetwsnak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018635.7808058-1012-54833187646779/AnsiballZ_file.py'
Jan 21 18:03:56 compute-0 sudo[172371]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:03:56 compute-0 python3.9[172373]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:03:56 compute-0 sudo[172371]: pam_unix(sudo:session): session closed for user root
Jan 21 18:03:56 compute-0 sudo[172523]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zbvemudqwrvniloebznxbcdxlxzqevww ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018636.4162116-1126-187017443439547/AnsiballZ_file.py'
Jan 21 18:03:56 compute-0 sudo[172523]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:03:56 compute-0 python3.9[172525]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:03:56 compute-0 sudo[172523]: pam_unix(sudo:session): session closed for user root
Jan 21 18:03:57 compute-0 sudo[172675]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vrxdvjhvlmgkfqcbjrdvazfyjpfomdac ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018637.095222-1126-224596572338915/AnsiballZ_file.py'
Jan 21 18:03:57 compute-0 sudo[172675]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:03:57 compute-0 python3.9[172677]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:03:57 compute-0 sudo[172675]: pam_unix(sudo:session): session closed for user root
Jan 21 18:03:57 compute-0 sudo[172827]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qjjpffyacwoikciiqzammxqjifqwenff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018637.6753693-1126-160989309679600/AnsiballZ_file.py'
Jan 21 18:03:57 compute-0 sudo[172827]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:03:58 compute-0 python3.9[172829]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:03:58 compute-0 sudo[172827]: pam_unix(sudo:session): session closed for user root
Jan 21 18:03:58 compute-0 sudo[172979]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qleymrsczhgkwwfdexnvuthnpryexwgs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018638.296301-1126-94192292688826/AnsiballZ_file.py'
Jan 21 18:03:58 compute-0 sudo[172979]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:03:58 compute-0 python3.9[172981]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:03:58 compute-0 sudo[172979]: pam_unix(sudo:session): session closed for user root
Jan 21 18:03:59 compute-0 sudo[173131]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-drcbukfvmyzduoiurfdkxquinnnvxsct ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018639.0154302-1126-233762684062940/AnsiballZ_file.py'
Jan 21 18:03:59 compute-0 sudo[173131]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:03:59 compute-0 python3.9[173133]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:03:59 compute-0 sudo[173131]: pam_unix(sudo:session): session closed for user root
Jan 21 18:03:59 compute-0 systemd[1]: virtnodedevd.service: Deactivated successfully.
Jan 21 18:03:59 compute-0 sudo[173284]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pxwjnsrstgmchlrujkpjjkvtcnjxbcnk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018639.6396606-1126-44612420309569/AnsiballZ_file.py'
Jan 21 18:03:59 compute-0 sudo[173284]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:04:00 compute-0 python3.9[173286]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:04:00 compute-0 sudo[173284]: pam_unix(sudo:session): session closed for user root
Jan 21 18:04:00 compute-0 sudo[173436]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-inhvhimfqniapwozqngdngmhpddgapjn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018640.2764156-1126-156322033258665/AnsiballZ_file.py'
Jan 21 18:04:00 compute-0 sudo[173436]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:04:00 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Jan 21 18:04:00 compute-0 python3.9[173438]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:04:00 compute-0 sudo[173436]: pam_unix(sudo:session): session closed for user root
Jan 21 18:04:01 compute-0 sudo[173589]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fjkoxxnvktthcfrzxjmaietdaakpopdm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018640.8256702-1126-50324197476037/AnsiballZ_file.py'
Jan 21 18:04:01 compute-0 sudo[173589]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:04:01 compute-0 python3.9[173591]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:04:01 compute-0 sudo[173589]: pam_unix(sudo:session): session closed for user root
Jan 21 18:04:01 compute-0 systemd[1]: virtqemud.service: Deactivated successfully.
Jan 21 18:04:01 compute-0 sudo[173742]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dqumsjcjkfvlyzvayalhgziwdhzvucqt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018641.5322115-1242-104289787077499/AnsiballZ_command.py'
Jan 21 18:04:01 compute-0 sudo[173742]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:04:01 compute-0 python3.9[173744]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:04:02 compute-0 sudo[173742]: pam_unix(sudo:session): session closed for user root
Jan 21 18:04:02 compute-0 systemd[1]: virtsecretd.service: Deactivated successfully.
Jan 21 18:04:02 compute-0 python3.9[173896]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 21 18:04:03 compute-0 sudo[174047]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uxlsmvdzrfamoseksosjrialltsaziou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018643.1305985-1278-188992949808657/AnsiballZ_systemd_service.py'
Jan 21 18:04:03 compute-0 sudo[174047]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:04:03 compute-0 python3.9[174049]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 21 18:04:03 compute-0 systemd[1]: Reloading.
Jan 21 18:04:03 compute-0 systemd-rc-local-generator[174076]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:04:03 compute-0 systemd-sysv-generator[174080]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:04:04 compute-0 sudo[174047]: pam_unix(sudo:session): session closed for user root
Jan 21 18:04:04 compute-0 sudo[174234]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ninnldzagvgjflcophtkhlpbuvbdkhgn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018644.1908016-1294-163800784686850/AnsiballZ_command.py'
Jan 21 18:04:04 compute-0 sudo[174234]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:04:04 compute-0 python3.9[174236]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:04:04 compute-0 sudo[174234]: pam_unix(sudo:session): session closed for user root
Jan 21 18:04:05 compute-0 sudo[174387]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qzdkutyaumcwxqhmdaiswbbfwoxzxpog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018644.844551-1294-194058017121918/AnsiballZ_command.py'
Jan 21 18:04:05 compute-0 sudo[174387]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:04:05 compute-0 python3.9[174389]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:04:05 compute-0 sudo[174387]: pam_unix(sudo:session): session closed for user root
Jan 21 18:04:05 compute-0 sudo[174540]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yelxyxwihkyeezswprxqzcjdxqkeblwg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018645.4408083-1294-104016201077166/AnsiballZ_command.py'
Jan 21 18:04:05 compute-0 sudo[174540]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:04:05 compute-0 python3.9[174542]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:04:05 compute-0 sudo[174540]: pam_unix(sudo:session): session closed for user root
Jan 21 18:04:06 compute-0 sudo[174693]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgglupszbkkqdmquirfeswnbdbkqtoat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018646.0199609-1294-134934684241867/AnsiballZ_command.py'
Jan 21 18:04:06 compute-0 sudo[174693]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:04:06 compute-0 python3.9[174695]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:04:06 compute-0 sudo[174693]: pam_unix(sudo:session): session closed for user root
Jan 21 18:04:06 compute-0 sudo[174846]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cpphntpwhrsfwlbkwqqjuhdpkbibpzog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018646.6339056-1294-253525474700738/AnsiballZ_command.py'
Jan 21 18:04:06 compute-0 sudo[174846]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:04:07 compute-0 python3.9[174848]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:04:07 compute-0 sudo[174846]: pam_unix(sudo:session): session closed for user root
Jan 21 18:04:07 compute-0 sudo[174999]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qiofwdjusvoalwncpitnfkjabxnnzxaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018647.2611423-1294-159606755556045/AnsiballZ_command.py'
Jan 21 18:04:07 compute-0 sudo[174999]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:04:07 compute-0 python3.9[175001]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:04:07 compute-0 sudo[174999]: pam_unix(sudo:session): session closed for user root
Jan 21 18:04:08 compute-0 sudo[175152]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ipbneksmhajhemlnuidlelrnvtqfgglj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018647.8142676-1294-119939143571655/AnsiballZ_command.py'
Jan 21 18:04:08 compute-0 sudo[175152]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:04:08 compute-0 python3.9[175154]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:04:08 compute-0 sudo[175152]: pam_unix(sudo:session): session closed for user root
Jan 21 18:04:08 compute-0 sudo[175305]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xwbgtssebgjdwuztmcthljdulbkampdz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018648.4267387-1294-232971848535144/AnsiballZ_command.py'
Jan 21 18:04:08 compute-0 sudo[175305]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:04:08 compute-0 python3.9[175307]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:04:08 compute-0 sudo[175305]: pam_unix(sudo:session): session closed for user root
Jan 21 18:04:10 compute-0 sudo[175458]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qnalpysctjursfyrjbdbomoipochlekj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018650.2916691-1437-3202094390955/AnsiballZ_file.py'
Jan 21 18:04:10 compute-0 sudo[175458]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:04:10 compute-0 python3.9[175460]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:04:10 compute-0 sudo[175458]: pam_unix(sudo:session): session closed for user root
Jan 21 18:04:11 compute-0 sudo[175610]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-enzbllmmdhcneumdqxjfqgfdpffvcobt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018650.9620862-1437-175344000627466/AnsiballZ_file.py'
Jan 21 18:04:11 compute-0 sudo[175610]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:04:11 compute-0 python3.9[175612]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:04:11 compute-0 sudo[175610]: pam_unix(sudo:session): session closed for user root
Jan 21 18:04:11 compute-0 sudo[175762]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jeswjfvaiisthjookorfgzgoxqhipaop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018651.586615-1437-3427194351801/AnsiballZ_file.py'
Jan 21 18:04:11 compute-0 sudo[175762]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:04:12 compute-0 python3.9[175764]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:04:12 compute-0 sudo[175762]: pam_unix(sudo:session): session closed for user root
Jan 21 18:04:12 compute-0 sudo[175914]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzywofpqeruvckaaptnhilvenuexcmvm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018652.238014-1481-149719456069245/AnsiballZ_file.py'
Jan 21 18:04:12 compute-0 sudo[175914]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:04:12 compute-0 python3.9[175916]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:04:12 compute-0 sudo[175914]: pam_unix(sudo:session): session closed for user root
Jan 21 18:04:13 compute-0 sudo[176066]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fucuvrnhekjcmeohkkgioukumgrnxdiw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018652.8616939-1481-185237744845935/AnsiballZ_file.py'
Jan 21 18:04:13 compute-0 sudo[176066]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:04:13 compute-0 python3.9[176068]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:04:13 compute-0 sudo[176066]: pam_unix(sudo:session): session closed for user root
Jan 21 18:04:13 compute-0 sudo[176218]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txvydelcauqkiqagslnjnwvslogtvedc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018653.535204-1481-147598095934180/AnsiballZ_file.py'
Jan 21 18:04:13 compute-0 sudo[176218]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:04:13 compute-0 python3.9[176220]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:04:14 compute-0 sudo[176218]: pam_unix(sudo:session): session closed for user root
Jan 21 18:04:14 compute-0 sudo[176370]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dewrwacxaxtepwheaaeqbgoeqfqysgax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018654.1537569-1481-143235099857097/AnsiballZ_file.py'
Jan 21 18:04:14 compute-0 sudo[176370]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:04:14 compute-0 python3.9[176372]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:04:14 compute-0 sudo[176370]: pam_unix(sudo:session): session closed for user root
Jan 21 18:04:15 compute-0 sudo[176522]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rfhdunwucjvtyfqqtmglfezyhtoaydnj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018654.82374-1481-48246016773949/AnsiballZ_file.py'
Jan 21 18:04:15 compute-0 sudo[176522]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:04:15 compute-0 python3.9[176524]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:04:15 compute-0 sudo[176522]: pam_unix(sudo:session): session closed for user root
Jan 21 18:04:15 compute-0 sudo[176674]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rwtnwodzjgizqwgwvmpvtuabkrgparva ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018655.4184492-1481-255490272364025/AnsiballZ_file.py'
Jan 21 18:04:15 compute-0 sudo[176674]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:04:15 compute-0 python3.9[176676]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:04:15 compute-0 sudo[176674]: pam_unix(sudo:session): session closed for user root
Jan 21 18:04:16 compute-0 sudo[176826]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jdzjibideoutpmqnbtzimmlwemjpyxto ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018656.0553482-1481-273358404878044/AnsiballZ_file.py'
Jan 21 18:04:16 compute-0 sudo[176826]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:04:16 compute-0 python3.9[176828]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:04:16 compute-0 sudo[176826]: pam_unix(sudo:session): session closed for user root
Jan 21 18:04:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:04:20.058 104698 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:04:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:04:20.060 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:04:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:04:20.060 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:04:21 compute-0 sudo[176978]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ocecxaqrintpmpszjuhckvnmlyhmsttn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018660.9498148-1718-272801034380928/AnsiballZ_getent.py'
Jan 21 18:04:21 compute-0 sudo[176978]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:04:21 compute-0 python3.9[176980]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Jan 21 18:04:21 compute-0 sudo[176978]: pam_unix(sudo:session): session closed for user root
Jan 21 18:04:22 compute-0 sudo[177131]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nzizoyacdcnqxnrcvzouyaryfkfcdkio ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018661.836245-1734-136648675238044/AnsiballZ_group.py'
Jan 21 18:04:22 compute-0 sudo[177131]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:04:22 compute-0 python3.9[177133]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 21 18:04:22 compute-0 groupadd[177134]: group added to /etc/group: name=nova, GID=42436
Jan 21 18:04:22 compute-0 groupadd[177134]: group added to /etc/gshadow: name=nova
Jan 21 18:04:22 compute-0 groupadd[177134]: new group: name=nova, GID=42436
Jan 21 18:04:22 compute-0 sudo[177131]: pam_unix(sudo:session): session closed for user root
Jan 21 18:04:23 compute-0 sudo[177289]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pjhuueobqrptggldyttwobydcsdqmude ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018662.8380997-1750-37132103866674/AnsiballZ_user.py'
Jan 21 18:04:23 compute-0 sudo[177289]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:04:23 compute-0 python3.9[177291]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 21 18:04:23 compute-0 useradd[177293]: new user: name=nova, UID=42436, GID=42436, home=/home/nova, shell=/bin/sh, from=/dev/pts/0
Jan 21 18:04:23 compute-0 useradd[177293]: add 'nova' to group 'libvirt'
Jan 21 18:04:23 compute-0 useradd[177293]: add 'nova' to shadow group 'libvirt'
Jan 21 18:04:23 compute-0 sudo[177289]: pam_unix(sudo:session): session closed for user root
Jan 21 18:04:24 compute-0 sshd-session[177324]: Accepted publickey for zuul from 192.168.122.30 port 60404 ssh2: ECDSA SHA256:cKen23vhgWniT1Xii+Y+iYDmAKCwydOnjbSSWnKmVRE
Jan 21 18:04:24 compute-0 systemd-logind[782]: New session 26 of user zuul.
Jan 21 18:04:24 compute-0 systemd[1]: Started Session 26 of User zuul.
Jan 21 18:04:24 compute-0 sshd-session[177324]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 21 18:04:24 compute-0 sshd-session[177327]: Received disconnect from 192.168.122.30 port 60404:11: disconnected by user
Jan 21 18:04:24 compute-0 sshd-session[177327]: Disconnected from user zuul 192.168.122.30 port 60404
Jan 21 18:04:24 compute-0 sshd-session[177324]: pam_unix(sshd:session): session closed for user zuul
Jan 21 18:04:24 compute-0 systemd[1]: session-26.scope: Deactivated successfully.
Jan 21 18:04:24 compute-0 systemd-logind[782]: Session 26 logged out. Waiting for processes to exit.
Jan 21 18:04:24 compute-0 systemd-logind[782]: Removed session 26.
Jan 21 18:04:25 compute-0 podman[177352]: 2026-01-21 18:04:25.01758512 +0000 UTC m=+0.070595572 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202)
Jan 21 18:04:25 compute-0 podman[177394]: 2026-01-21 18:04:25.137946533 +0000 UTC m=+0.078794549 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Jan 21 18:04:25 compute-0 python3.9[177523]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:04:25 compute-0 python3.9[177644]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769018665.03048-1800-3040438426754/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:04:26 compute-0 python3.9[177794]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:04:26 compute-0 python3.9[177870]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:04:27 compute-0 python3.9[178020]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:04:27 compute-0 python3.9[178141]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769018667.1049392-1800-207454247444711/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:04:28 compute-0 python3.9[178291]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:04:29 compute-0 python3.9[178412]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769018668.1254175-1800-157523539328834/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=1feba546d0beacad9258164ab79b8a747685ccc8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:04:29 compute-0 python3.9[178562]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:04:30 compute-0 python3.9[178683]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769018669.3666594-1800-255954935225558/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:04:31 compute-0 python3.9[178833]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:04:31 compute-0 python3.9[178954]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769018670.6083956-1800-73484159448011/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:04:32 compute-0 sudo[179104]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ckheyrknpfkgfucelgghythnuqjcfumy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018672.0175476-1966-141582688295640/AnsiballZ_file.py'
Jan 21 18:04:32 compute-0 sudo[179104]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:04:32 compute-0 python3.9[179106]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:04:32 compute-0 sudo[179104]: pam_unix(sudo:session): session closed for user root
Jan 21 18:04:32 compute-0 sudo[179256]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uxbyyfkhyligcvfctepvtpirsxntkumh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018672.698335-1982-44253220608158/AnsiballZ_copy.py'
Jan 21 18:04:32 compute-0 sudo[179256]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:04:33 compute-0 python3.9[179258]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:04:33 compute-0 sudo[179256]: pam_unix(sudo:session): session closed for user root
Jan 21 18:04:33 compute-0 sudo[179408]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yelolesdjagqcgghzmogomypwgwavibr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018673.3845956-1998-69318271237548/AnsiballZ_stat.py'
Jan 21 18:04:33 compute-0 sudo[179408]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:04:33 compute-0 python3.9[179410]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 18:04:33 compute-0 sudo[179408]: pam_unix(sudo:session): session closed for user root
Jan 21 18:04:34 compute-0 sudo[179560]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-clnaxflyziweauamqlkakejjniotvwpx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018674.0626113-2014-126523508917020/AnsiballZ_stat.py'
Jan 21 18:04:34 compute-0 sudo[179560]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:04:34 compute-0 python3.9[179562]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:04:34 compute-0 sudo[179560]: pam_unix(sudo:session): session closed for user root
Jan 21 18:04:34 compute-0 sudo[179683]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ccpgavvrvoywevtgctksgzirgyqximiq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018674.0626113-2014-126523508917020/AnsiballZ_copy.py'
Jan 21 18:04:34 compute-0 sudo[179683]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:04:34 compute-0 python3.9[179685]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1769018674.0626113-2014-126523508917020/.source _original_basename=.ts26qfe4 follow=False checksum=be4e7fc979a6d92892ea63fbfe22718ecb31d8a4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Jan 21 18:04:34 compute-0 sudo[179683]: pam_unix(sudo:session): session closed for user root
Jan 21 18:04:35 compute-0 python3.9[179837]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 18:04:36 compute-0 python3.9[179989]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:04:37 compute-0 python3.9[180110]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769018676.182895-2066-34709166420709/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=aff5546b44cf4461a7541a94e4cce1332c9b58b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:04:37 compute-0 python3.9[180260]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:04:38 compute-0 python3.9[180381]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769018677.4064758-2096-67566813736267/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=60b024e6db49dc6e700fc0d50263944d98d4c034 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:04:39 compute-0 sudo[180531]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvmfyltkiviiummtwnyhlfrhjkopjqub ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018678.7639534-2130-55372096315314/AnsiballZ_container_config_data.py'
Jan 21 18:04:39 compute-0 sudo[180531]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:04:39 compute-0 python3.9[180533]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Jan 21 18:04:39 compute-0 sudo[180531]: pam_unix(sudo:session): session closed for user root
Jan 21 18:04:40 compute-0 sudo[180683]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dwqtpizvmalsqdcbouavufkzuojjwyxo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018679.7585938-2152-5869848833122/AnsiballZ_container_config_hash.py'
Jan 21 18:04:40 compute-0 sudo[180683]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:04:40 compute-0 python3.9[180685]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 21 18:04:40 compute-0 sudo[180683]: pam_unix(sudo:session): session closed for user root
Jan 21 18:04:41 compute-0 sudo[180835]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hbsplikuomrzkdbrkzqfnmkwiyfjdqcv ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769018680.7501426-2172-75173869525847/AnsiballZ_edpm_container_manage.py'
Jan 21 18:04:41 compute-0 sudo[180835]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:04:41 compute-0 python3[180837]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json containers=[] log_base_path=/var/log/containers/stdouts debug=False
Jan 21 18:04:41 compute-0 podman[180875]: 2026-01-21 18:04:41.809969718 +0000 UTC m=+0.062071818 container create 42f35023b5a9f99dad8a6c8f4bf94f3a3aacdf107e6937a63c6fd131f067ecdd (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=nova_compute_init, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=edpm, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 21 18:04:41 compute-0 podman[180875]: 2026-01-21 18:04:41.774721834 +0000 UTC m=+0.026824024 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Jan 21 18:04:41 compute-0 python3[180837]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Jan 21 18:04:42 compute-0 sudo[180835]: pam_unix(sudo:session): session closed for user root
Jan 21 18:04:42 compute-0 sudo[181063]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eyrmigfijneykssauwezgmykdporpxmq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018682.2394679-2188-80437578185550/AnsiballZ_stat.py'
Jan 21 18:04:42 compute-0 sudo[181063]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:04:42 compute-0 python3.9[181065]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 18:04:42 compute-0 sudo[181063]: pam_unix(sudo:session): session closed for user root
Jan 21 18:04:43 compute-0 sudo[181217]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vsnkuezmtoupzgbhpooyollnnhqxecxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018683.226543-2212-217544564521393/AnsiballZ_container_config_data.py'
Jan 21 18:04:43 compute-0 sudo[181217]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:04:43 compute-0 python3.9[181219]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Jan 21 18:04:43 compute-0 sudo[181217]: pam_unix(sudo:session): session closed for user root
Jan 21 18:04:44 compute-0 sudo[181369]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xeqwzuynptfhcgnjzvstoesefadadkti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018684.1584325-2234-13105809563020/AnsiballZ_container_config_hash.py'
Jan 21 18:04:44 compute-0 sudo[181369]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:04:44 compute-0 python3.9[181371]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 21 18:04:44 compute-0 sudo[181369]: pam_unix(sudo:session): session closed for user root
Jan 21 18:04:45 compute-0 sudo[181521]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqltdkguwcwneigayqnoaorqjrlpmmyr ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769018685.141734-2254-116906216046877/AnsiballZ_edpm_container_manage.py'
Jan 21 18:04:45 compute-0 sudo[181521]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:04:45 compute-0 python3[181523]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json containers=[] log_base_path=/var/log/containers/stdouts debug=False
Jan 21 18:04:45 compute-0 podman[181557]: 2026-01-21 18:04:45.84129244 +0000 UTC m=+0.047836347 container create b597cf4015a6e9d1d3d88ee855ec80c3443a363fe8697a0cf71d96d17aa2226f (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, container_name=nova_compute, managed_by=edpm_ansible)
Jan 21 18:04:45 compute-0 podman[181557]: 2026-01-21 18:04:45.813083024 +0000 UTC m=+0.019626941 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Jan 21 18:04:45 compute-0 python3[181523]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath --volume /etc/multipath.conf:/etc/multipath.conf:ro,Z --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start
Jan 21 18:04:45 compute-0 sudo[181521]: pam_unix(sudo:session): session closed for user root
Jan 21 18:04:47 compute-0 sudo[181745]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzivczdfasmpbziertgwqrlwxrwphbtl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018686.7755375-2270-184268473797712/AnsiballZ_stat.py'
Jan 21 18:04:47 compute-0 sudo[181745]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:04:47 compute-0 python3.9[181747]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 18:04:47 compute-0 sudo[181745]: pam_unix(sudo:session): session closed for user root
Jan 21 18:04:47 compute-0 sudo[181899]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fdcbynqmkjklmjvapiwoyaummikasoch ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018687.60662-2288-71034356343321/AnsiballZ_file.py'
Jan 21 18:04:47 compute-0 sudo[181899]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:04:48 compute-0 python3.9[181901]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:04:48 compute-0 sudo[181899]: pam_unix(sudo:session): session closed for user root
Jan 21 18:04:48 compute-0 sudo[182052]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iftimpuavbovggxajjknvshzgryngvzy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018688.1515503-2288-31014633634492/AnsiballZ_copy.py'
Jan 21 18:04:48 compute-0 sudo[182052]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:04:48 compute-0 python3.9[182054]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769018688.1515503-2288-31014633634492/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:04:48 compute-0 sudo[182052]: pam_unix(sudo:session): session closed for user root
Jan 21 18:04:48 compute-0 sshd-session[182025]: Invalid user telemetry from 64.227.98.100 port 39830
Jan 21 18:04:49 compute-0 sshd-session[182025]: Connection closed by invalid user telemetry 64.227.98.100 port 39830 [preauth]
Jan 21 18:04:49 compute-0 sudo[182128]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xshsjscygllwqhkvwbyfughlcbaayfha ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018688.1515503-2288-31014633634492/AnsiballZ_systemd.py'
Jan 21 18:04:49 compute-0 sudo[182128]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:04:49 compute-0 python3.9[182130]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 21 18:04:49 compute-0 systemd[1]: Reloading.
Jan 21 18:04:49 compute-0 systemd-rc-local-generator[182158]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:04:49 compute-0 systemd-sysv-generator[182162]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:04:49 compute-0 sudo[182128]: pam_unix(sudo:session): session closed for user root
Jan 21 18:04:50 compute-0 sudo[182239]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pdbzaahuzrskaatmcuesspnophtykljm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018688.1515503-2288-31014633634492/AnsiballZ_systemd.py'
Jan 21 18:04:50 compute-0 sudo[182239]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:04:50 compute-0 python3.9[182241]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 18:04:50 compute-0 systemd[1]: Reloading.
Jan 21 18:04:50 compute-0 systemd-sysv-generator[182273]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:04:50 compute-0 systemd-rc-local-generator[182269]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:04:50 compute-0 systemd[1]: Starting nova_compute container...
Jan 21 18:04:50 compute-0 systemd[1]: Started libcrun container.
Jan 21 18:04:50 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8a7c1c5294b4767ed1c439179e42fc3f67ebe451304dea8b0b429adc65bd1db9/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Jan 21 18:04:50 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8a7c1c5294b4767ed1c439179e42fc3f67ebe451304dea8b0b429adc65bd1db9/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Jan 21 18:04:50 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8a7c1c5294b4767ed1c439179e42fc3f67ebe451304dea8b0b429adc65bd1db9/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Jan 21 18:04:50 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8a7c1c5294b4767ed1c439179e42fc3f67ebe451304dea8b0b429adc65bd1db9/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 21 18:04:50 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8a7c1c5294b4767ed1c439179e42fc3f67ebe451304dea8b0b429adc65bd1db9/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Jan 21 18:04:50 compute-0 podman[182280]: 2026-01-21 18:04:50.760096278 +0000 UTC m=+0.085495829 container init b597cf4015a6e9d1d3d88ee855ec80c3443a363fe8697a0cf71d96d17aa2226f (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=edpm, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 21 18:04:50 compute-0 podman[182280]: 2026-01-21 18:04:50.772816283 +0000 UTC m=+0.098215814 container start b597cf4015a6e9d1d3d88ee855ec80c3443a363fe8697a0cf71d96d17aa2226f (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=edpm, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 18:04:50 compute-0 podman[182280]: nova_compute
Jan 21 18:04:50 compute-0 nova_compute[182296]: + sudo -E kolla_set_configs
Jan 21 18:04:50 compute-0 systemd[1]: Started nova_compute container.
Jan 21 18:04:50 compute-0 sudo[182239]: pam_unix(sudo:session): session closed for user root
Jan 21 18:04:50 compute-0 nova_compute[182296]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 21 18:04:50 compute-0 nova_compute[182296]: INFO:__main__:Validating config file
Jan 21 18:04:50 compute-0 nova_compute[182296]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 21 18:04:50 compute-0 nova_compute[182296]: INFO:__main__:Copying service configuration files
Jan 21 18:04:50 compute-0 nova_compute[182296]: INFO:__main__:Deleting /etc/nova/nova.conf
Jan 21 18:04:50 compute-0 nova_compute[182296]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Jan 21 18:04:50 compute-0 nova_compute[182296]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Jan 21 18:04:50 compute-0 nova_compute[182296]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Jan 21 18:04:50 compute-0 nova_compute[182296]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Jan 21 18:04:50 compute-0 nova_compute[182296]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 21 18:04:50 compute-0 nova_compute[182296]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 21 18:04:50 compute-0 nova_compute[182296]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Jan 21 18:04:50 compute-0 nova_compute[182296]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Jan 21 18:04:50 compute-0 nova_compute[182296]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 21 18:04:50 compute-0 nova_compute[182296]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 21 18:04:50 compute-0 nova_compute[182296]: INFO:__main__:Deleting /etc/ceph
Jan 21 18:04:50 compute-0 nova_compute[182296]: INFO:__main__:Creating directory /etc/ceph
Jan 21 18:04:50 compute-0 nova_compute[182296]: INFO:__main__:Setting permission for /etc/ceph
Jan 21 18:04:50 compute-0 nova_compute[182296]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Jan 21 18:04:50 compute-0 nova_compute[182296]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 21 18:04:50 compute-0 nova_compute[182296]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Jan 21 18:04:50 compute-0 nova_compute[182296]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 21 18:04:50 compute-0 nova_compute[182296]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Jan 21 18:04:50 compute-0 nova_compute[182296]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Jan 21 18:04:50 compute-0 nova_compute[182296]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Jan 21 18:04:50 compute-0 nova_compute[182296]: INFO:__main__:Writing out command to execute
Jan 21 18:04:50 compute-0 nova_compute[182296]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Jan 21 18:04:50 compute-0 nova_compute[182296]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 21 18:04:50 compute-0 nova_compute[182296]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 21 18:04:50 compute-0 nova_compute[182296]: ++ cat /run_command
Jan 21 18:04:50 compute-0 nova_compute[182296]: + CMD=nova-compute
Jan 21 18:04:50 compute-0 nova_compute[182296]: + ARGS=
Jan 21 18:04:50 compute-0 nova_compute[182296]: + sudo kolla_copy_cacerts
Jan 21 18:04:50 compute-0 nova_compute[182296]: + [[ ! -n '' ]]
Jan 21 18:04:50 compute-0 nova_compute[182296]: + . kolla_extend_start
Jan 21 18:04:50 compute-0 nova_compute[182296]: Running command: 'nova-compute'
Jan 21 18:04:50 compute-0 nova_compute[182296]: + echo 'Running command: '\''nova-compute'\'''
Jan 21 18:04:50 compute-0 nova_compute[182296]: + umask 0022
Jan 21 18:04:50 compute-0 nova_compute[182296]: + exec nova-compute
Jan 21 18:04:52 compute-0 python3.9[182458]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 18:04:52 compute-0 nova_compute[182296]: 2026-01-21 18:04:52.806 182300 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Jan 21 18:04:52 compute-0 nova_compute[182296]: 2026-01-21 18:04:52.806 182300 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Jan 21 18:04:52 compute-0 nova_compute[182296]: 2026-01-21 18:04:52.806 182300 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Jan 21 18:04:52 compute-0 nova_compute[182296]: 2026-01-21 18:04:52.807 182300 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Jan 21 18:04:52 compute-0 nova_compute[182296]: 2026-01-21 18:04:52.964 182300 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:04:52 compute-0 nova_compute[182296]: 2026-01-21 18:04:52.978 182300 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.014s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:04:52 compute-0 nova_compute[182296]: 2026-01-21 18:04:52.978 182300 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Jan 21 18:04:53 compute-0 python3.9[182612]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.649 182300 INFO nova.virt.driver [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.774 182300 INFO nova.compute.provider_config [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.868 182300 DEBUG oslo_concurrency.lockutils [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.869 182300 DEBUG oslo_concurrency.lockutils [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.869 182300 DEBUG oslo_concurrency.lockutils [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.869 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.869 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.869 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.870 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.870 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.870 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.870 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.870 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.871 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.871 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.871 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.871 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.871 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.872 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.872 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.872 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.872 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.872 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.872 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.873 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.873 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.873 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.873 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.873 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.873 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.874 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.874 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.874 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.874 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.874 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.874 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.874 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.875 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.875 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.875 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.875 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.875 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.875 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.876 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.876 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.876 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.876 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.876 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.876 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.876 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.877 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.877 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.877 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.877 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.877 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.877 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.877 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.878 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.878 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.878 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.878 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.878 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.879 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.879 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.879 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.879 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.879 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.879 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.879 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.880 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.880 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.880 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.880 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.880 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.880 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.880 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.881 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.881 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.881 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.881 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.881 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.881 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.881 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.882 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.882 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.882 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.882 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.882 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.882 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.883 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.883 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.883 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.883 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.883 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.883 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.883 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.884 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.884 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.884 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.884 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.884 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.884 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.884 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.885 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.885 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.885 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.885 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.885 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.885 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.885 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.886 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.886 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.886 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.886 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.886 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.886 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.886 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.886 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.887 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.887 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.887 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.887 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.887 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.887 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.887 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.888 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.888 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.888 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.888 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.888 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.888 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.888 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.889 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.889 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.889 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.889 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.889 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.889 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.889 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.889 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.890 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.890 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.890 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.890 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.890 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.890 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.890 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.891 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.891 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.891 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.891 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.891 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.891 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.891 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.892 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.892 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.892 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.892 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.892 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.892 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.893 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.893 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.893 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.893 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.893 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.893 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.893 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.894 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.894 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.894 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.894 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.894 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.894 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.894 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.895 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.895 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.895 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.895 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.895 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.895 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.895 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.896 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.896 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.896 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.896 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.896 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.896 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.896 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.897 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.897 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.897 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.897 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.897 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.897 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.897 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.898 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.898 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.898 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.898 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.898 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.898 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.898 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.899 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.899 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.899 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.899 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.899 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.899 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.899 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.900 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.900 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.900 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.900 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.900 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.900 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.900 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.901 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.901 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.901 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.901 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.901 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.901 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.901 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.902 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.902 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.902 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.902 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.902 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.902 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.902 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.902 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.903 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.903 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.903 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.903 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.903 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.903 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.903 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.904 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.904 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.904 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.904 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.904 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.904 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.904 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.905 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.905 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.905 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.905 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.905 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.905 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.905 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.906 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.906 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.906 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.906 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.906 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.906 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.906 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.907 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.907 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.907 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.907 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.907 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.907 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.907 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.908 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.908 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.908 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.908 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.908 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.908 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.908 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.909 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.909 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.909 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.909 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.909 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.909 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.910 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.910 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.910 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.910 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.910 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.910 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.910 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.910 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.911 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.911 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.911 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.911 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.911 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.911 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.911 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.912 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.912 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.912 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.912 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.912 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.912 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.912 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.913 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.913 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.913 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.913 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.913 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.913 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.913 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.914 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.914 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.914 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.914 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.914 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.914 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.914 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.915 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.915 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.915 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.915 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.915 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.915 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.916 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.916 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.916 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.916 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.916 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.916 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.916 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.916 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.917 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.917 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.917 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.917 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.917 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.917 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.917 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.918 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.918 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.918 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.918 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.918 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.918 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.918 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.919 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.919 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.919 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.919 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.919 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.919 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.919 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.920 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.920 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.920 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.920 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.920 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.920 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.920 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.921 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.921 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.921 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.921 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.921 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.922 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.922 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.922 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.922 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.922 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.922 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.922 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.923 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.923 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.923 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.923 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.923 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.923 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.923 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.923 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.924 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.924 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.924 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.924 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.924 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.924 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.924 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.925 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.925 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.925 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.925 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.925 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.925 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.925 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.926 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.926 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.926 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.926 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.926 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.926 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.926 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.927 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.927 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.927 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.927 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.927 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.927 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.928 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.928 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.928 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.928 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.928 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.928 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.928 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.929 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.929 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.929 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.929 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.929 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.929 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.929 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.930 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.930 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.930 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.930 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.930 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.930 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.930 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.931 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.931 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.931 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.931 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.931 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.931 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.931 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.931 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.932 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.932 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.932 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.932 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.932 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.932 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.932 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.933 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.933 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.933 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.933 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.933 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.933 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.933 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.934 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.934 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.934 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.934 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.934 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.934 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.935 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.935 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.935 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.935 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.935 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.935 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.936 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.936 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.936 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.936 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.936 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.936 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.936 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.937 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.937 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.937 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.937 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.937 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.937 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.938 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.938 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.938 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.938 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.938 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.938 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.939 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.939 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.939 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.939 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.939 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.939 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.939 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.940 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.940 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.940 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.940 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.940 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.940 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.940 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.940 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.941 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.941 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.941 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.941 182300 WARNING oslo_config.cfg [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Jan 21 18:04:53 compute-0 nova_compute[182296]: live_migration_uri is deprecated for removal in favor of two other options that
Jan 21 18:04:53 compute-0 nova_compute[182296]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Jan 21 18:04:53 compute-0 nova_compute[182296]: and ``live_migration_inbound_addr`` respectively.
Jan 21 18:04:53 compute-0 nova_compute[182296]: ).  Its value may be silently ignored in the future.
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.941 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.942 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.942 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.942 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.942 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.942 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.942 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.943 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.943 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.943 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.943 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.943 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.943 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.943 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.944 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.944 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.944 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.944 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.944 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.944 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.944 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.945 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.945 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.945 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.945 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.945 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.945 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.945 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.946 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.946 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.946 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.946 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.946 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.946 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.946 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.947 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.947 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.947 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.947 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.947 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.947 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.948 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.948 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.948 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.948 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.948 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.948 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.949 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.949 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.949 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.949 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.949 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.950 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.950 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.950 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.950 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.950 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.950 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.951 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.951 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.951 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.951 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.951 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.951 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.952 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.952 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.952 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.952 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.952 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.952 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.952 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.953 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.953 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.953 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.953 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.953 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.953 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.953 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.954 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.954 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.954 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.954 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.954 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.954 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.955 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.955 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.955 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.955 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.955 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.955 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.956 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.956 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.956 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.956 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.956 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.956 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.956 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.957 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.957 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.957 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.957 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.957 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.957 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.957 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.958 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.958 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.958 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.958 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.958 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.958 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.958 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.959 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.959 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.959 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.959 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.959 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.959 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.959 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.960 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.960 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.960 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.960 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.960 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.960 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.960 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.961 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.961 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.961 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.961 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.961 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.961 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.962 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.962 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.962 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.962 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.962 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.962 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.962 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.963 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.963 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.963 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.963 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.963 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.964 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.964 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.964 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.964 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.964 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.964 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.964 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.965 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.965 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.965 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.965 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.965 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.965 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.966 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.966 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.966 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.966 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.966 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.966 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.966 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.967 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.967 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.967 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.967 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.967 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.967 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.967 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.968 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.968 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.968 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.968 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.968 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.968 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.969 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.969 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.969 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.969 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.969 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.969 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.969 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.970 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.970 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.970 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.970 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.970 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.970 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.971 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.971 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.971 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.971 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.971 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.971 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.971 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.972 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.972 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.972 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.972 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.972 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.972 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.973 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.973 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.973 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.973 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.973 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.973 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.974 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.974 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.974 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.974 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.974 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.974 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.974 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.975 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.975 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.975 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.975 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.975 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.975 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.975 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.976 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.976 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.976 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.976 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.976 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.976 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.976 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.977 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.977 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.977 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.977 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.977 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.977 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.977 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.977 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.978 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.978 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.978 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.978 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.978 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.978 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.978 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.979 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.979 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.979 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.979 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.979 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.979 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.980 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.980 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.980 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.980 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.980 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.980 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.980 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.981 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.981 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.981 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.981 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.981 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.981 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.981 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.982 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.982 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.982 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.982 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.982 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.982 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.982 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.983 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.983 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.983 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.983 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.983 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.983 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.983 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.984 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.984 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.984 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.984 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.984 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.984 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.984 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.985 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.985 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.985 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.985 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.985 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.985 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.986 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.986 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.986 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.986 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.986 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.986 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.987 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.987 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.987 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.987 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.987 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.988 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.988 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.988 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.988 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.988 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.989 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.989 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.989 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.989 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.989 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.990 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.990 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.990 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.990 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.990 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.990 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.991 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.991 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.991 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.991 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.991 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.991 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.992 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.992 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.992 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.992 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.992 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.992 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.993 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.993 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.993 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.993 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.993 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.993 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.994 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.994 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.994 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.994 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.994 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.995 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.995 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.995 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.995 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.995 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.996 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.996 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.996 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.996 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.996 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.996 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.996 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.997 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.997 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.997 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.997 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.997 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.997 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.998 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.998 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.998 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.998 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.998 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.999 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.999 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.999 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:53 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.999 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:54 compute-0 nova_compute[182296]: 2026-01-21 18:04:53.999 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:54 compute-0 nova_compute[182296]: 2026-01-21 18:04:54.000 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:54 compute-0 nova_compute[182296]: 2026-01-21 18:04:54.000 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:54 compute-0 nova_compute[182296]: 2026-01-21 18:04:54.000 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:54 compute-0 nova_compute[182296]: 2026-01-21 18:04:54.000 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:54 compute-0 nova_compute[182296]: 2026-01-21 18:04:54.000 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:54 compute-0 nova_compute[182296]: 2026-01-21 18:04:54.000 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:54 compute-0 nova_compute[182296]: 2026-01-21 18:04:54.001 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:54 compute-0 nova_compute[182296]: 2026-01-21 18:04:54.001 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:54 compute-0 nova_compute[182296]: 2026-01-21 18:04:54.001 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:54 compute-0 nova_compute[182296]: 2026-01-21 18:04:54.001 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:54 compute-0 nova_compute[182296]: 2026-01-21 18:04:54.001 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:54 compute-0 nova_compute[182296]: 2026-01-21 18:04:54.001 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:54 compute-0 nova_compute[182296]: 2026-01-21 18:04:54.002 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:54 compute-0 nova_compute[182296]: 2026-01-21 18:04:54.002 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:54 compute-0 nova_compute[182296]: 2026-01-21 18:04:54.002 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:54 compute-0 nova_compute[182296]: 2026-01-21 18:04:54.002 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:54 compute-0 nova_compute[182296]: 2026-01-21 18:04:54.002 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:54 compute-0 nova_compute[182296]: 2026-01-21 18:04:54.002 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:54 compute-0 nova_compute[182296]: 2026-01-21 18:04:54.002 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:54 compute-0 nova_compute[182296]: 2026-01-21 18:04:54.003 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:54 compute-0 nova_compute[182296]: 2026-01-21 18:04:54.003 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:54 compute-0 nova_compute[182296]: 2026-01-21 18:04:54.003 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:54 compute-0 nova_compute[182296]: 2026-01-21 18:04:54.003 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:54 compute-0 nova_compute[182296]: 2026-01-21 18:04:54.003 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:54 compute-0 nova_compute[182296]: 2026-01-21 18:04:54.003 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:54 compute-0 nova_compute[182296]: 2026-01-21 18:04:54.003 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:54 compute-0 nova_compute[182296]: 2026-01-21 18:04:54.004 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:54 compute-0 nova_compute[182296]: 2026-01-21 18:04:54.004 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:54 compute-0 nova_compute[182296]: 2026-01-21 18:04:54.004 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:54 compute-0 nova_compute[182296]: 2026-01-21 18:04:54.004 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:54 compute-0 nova_compute[182296]: 2026-01-21 18:04:54.004 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:54 compute-0 nova_compute[182296]: 2026-01-21 18:04:54.004 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:54 compute-0 nova_compute[182296]: 2026-01-21 18:04:54.005 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:54 compute-0 nova_compute[182296]: 2026-01-21 18:04:54.005 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:54 compute-0 nova_compute[182296]: 2026-01-21 18:04:54.005 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:54 compute-0 nova_compute[182296]: 2026-01-21 18:04:54.005 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:54 compute-0 nova_compute[182296]: 2026-01-21 18:04:54.005 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:54 compute-0 nova_compute[182296]: 2026-01-21 18:04:54.005 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:54 compute-0 nova_compute[182296]: 2026-01-21 18:04:54.005 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:54 compute-0 nova_compute[182296]: 2026-01-21 18:04:54.006 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:54 compute-0 nova_compute[182296]: 2026-01-21 18:04:54.006 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:54 compute-0 nova_compute[182296]: 2026-01-21 18:04:54.006 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:54 compute-0 nova_compute[182296]: 2026-01-21 18:04:54.006 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:54 compute-0 nova_compute[182296]: 2026-01-21 18:04:54.006 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:54 compute-0 nova_compute[182296]: 2026-01-21 18:04:54.006 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:54 compute-0 nova_compute[182296]: 2026-01-21 18:04:54.006 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:54 compute-0 nova_compute[182296]: 2026-01-21 18:04:54.007 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:54 compute-0 nova_compute[182296]: 2026-01-21 18:04:54.007 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:54 compute-0 nova_compute[182296]: 2026-01-21 18:04:54.007 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:54 compute-0 nova_compute[182296]: 2026-01-21 18:04:54.007 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:54 compute-0 nova_compute[182296]: 2026-01-21 18:04:54.007 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:54 compute-0 nova_compute[182296]: 2026-01-21 18:04:54.007 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:54 compute-0 nova_compute[182296]: 2026-01-21 18:04:54.007 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:54 compute-0 nova_compute[182296]: 2026-01-21 18:04:54.008 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:04:54 compute-0 nova_compute[182296]: 2026-01-21 18:04:54.008 182300 DEBUG oslo_service.service [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Jan 21 18:04:54 compute-0 nova_compute[182296]: 2026-01-21 18:04:54.009 182300 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Jan 21 18:04:54 compute-0 nova_compute[182296]: 2026-01-21 18:04:54.024 182300 DEBUG nova.virt.libvirt.host [None req-29ba64e2-34eb-42ca-86e4-9856bcb4b348 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Jan 21 18:04:54 compute-0 nova_compute[182296]: 2026-01-21 18:04:54.025 182300 DEBUG nova.virt.libvirt.host [None req-29ba64e2-34eb-42ca-86e4-9856bcb4b348 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Jan 21 18:04:54 compute-0 nova_compute[182296]: 2026-01-21 18:04:54.025 182300 DEBUG nova.virt.libvirt.host [None req-29ba64e2-34eb-42ca-86e4-9856bcb4b348 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Jan 21 18:04:54 compute-0 nova_compute[182296]: 2026-01-21 18:04:54.026 182300 DEBUG nova.virt.libvirt.host [None req-29ba64e2-34eb-42ca-86e4-9856bcb4b348 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Jan 21 18:04:54 compute-0 systemd[1]: Starting libvirt QEMU daemon...
Jan 21 18:04:54 compute-0 systemd[1]: Started libvirt QEMU daemon.
Jan 21 18:04:54 compute-0 nova_compute[182296]: 2026-01-21 18:04:54.093 182300 DEBUG nova.virt.libvirt.host [None req-29ba64e2-34eb-42ca-86e4-9856bcb4b348 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7fc42fa32e20> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Jan 21 18:04:54 compute-0 nova_compute[182296]: 2026-01-21 18:04:54.096 182300 DEBUG nova.virt.libvirt.host [None req-29ba64e2-34eb-42ca-86e4-9856bcb4b348 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7fc42fa32e20> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Jan 21 18:04:54 compute-0 nova_compute[182296]: 2026-01-21 18:04:54.098 182300 INFO nova.virt.libvirt.driver [None req-29ba64e2-34eb-42ca-86e4-9856bcb4b348 - - - - - -] Connection event '1' reason 'None'
Jan 21 18:04:54 compute-0 nova_compute[182296]: 2026-01-21 18:04:54.112 182300 WARNING nova.virt.libvirt.driver [None req-29ba64e2-34eb-42ca-86e4-9856bcb4b348 - - - - - -] Cannot update service status on host "compute-0.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Jan 21 18:04:54 compute-0 nova_compute[182296]: 2026-01-21 18:04:54.113 182300 DEBUG nova.virt.libvirt.volume.mount [None req-29ba64e2-34eb-42ca-86e4-9856bcb4b348 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Jan 21 18:04:54 compute-0 python3.9[182814]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 18:04:54 compute-0 nova_compute[182296]: 2026-01-21 18:04:54.909 182300 INFO nova.virt.libvirt.host [None req-29ba64e2-34eb-42ca-86e4-9856bcb4b348 - - - - - -] Libvirt host capabilities <capabilities>
Jan 21 18:04:54 compute-0 nova_compute[182296]: 
Jan 21 18:04:54 compute-0 nova_compute[182296]:   <host>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     <uuid>3193fe30-ac0a-415e-b2b2-55df2b1703a4</uuid>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     <cpu>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <arch>x86_64</arch>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model>EPYC-Rome-v4</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <vendor>AMD</vendor>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <microcode version='16777317'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <signature family='23' model='49' stepping='0'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <maxphysaddr mode='emulate' bits='40'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <feature name='x2apic'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <feature name='tsc-deadline'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <feature name='osxsave'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <feature name='hypervisor'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <feature name='tsc_adjust'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <feature name='spec-ctrl'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <feature name='stibp'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <feature name='arch-capabilities'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <feature name='ssbd'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <feature name='cmp_legacy'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <feature name='topoext'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <feature name='virt-ssbd'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <feature name='lbrv'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <feature name='tsc-scale'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <feature name='vmcb-clean'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <feature name='pause-filter'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <feature name='pfthreshold'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <feature name='svme-addr-chk'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <feature name='rdctl-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <feature name='skip-l1dfl-vmentry'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <feature name='mds-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <feature name='pschange-mc-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <pages unit='KiB' size='4'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <pages unit='KiB' size='2048'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <pages unit='KiB' size='1048576'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     </cpu>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     <power_management>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <suspend_mem/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <suspend_disk/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <suspend_hybrid/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     </power_management>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     <iommu support='no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     <migration_features>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <live/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <uri_transports>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <uri_transport>tcp</uri_transport>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <uri_transport>rdma</uri_transport>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </uri_transports>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     </migration_features>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     <topology>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <cells num='1'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <cell id='0'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:           <memory unit='KiB'>7864316</memory>
Jan 21 18:04:54 compute-0 nova_compute[182296]:           <pages unit='KiB' size='4'>1966079</pages>
Jan 21 18:04:54 compute-0 nova_compute[182296]:           <pages unit='KiB' size='2048'>0</pages>
Jan 21 18:04:54 compute-0 nova_compute[182296]:           <pages unit='KiB' size='1048576'>0</pages>
Jan 21 18:04:54 compute-0 nova_compute[182296]:           <distances>
Jan 21 18:04:54 compute-0 nova_compute[182296]:             <sibling id='0' value='10'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:           </distances>
Jan 21 18:04:54 compute-0 nova_compute[182296]:           <cpus num='8'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:           </cpus>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         </cell>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </cells>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     </topology>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     <cache>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     </cache>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     <secmodel>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model>selinux</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <doi>0</doi>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     </secmodel>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     <secmodel>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model>dac</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <doi>0</doi>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <baselabel type='kvm'>+107:+107</baselabel>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <baselabel type='qemu'>+107:+107</baselabel>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     </secmodel>
Jan 21 18:04:54 compute-0 nova_compute[182296]:   </host>
Jan 21 18:04:54 compute-0 nova_compute[182296]: 
Jan 21 18:04:54 compute-0 nova_compute[182296]:   <guest>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     <os_type>hvm</os_type>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     <arch name='i686'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <wordsize>32</wordsize>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <domain type='qemu'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <domain type='kvm'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     </arch>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     <features>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <pae/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <nonpae/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <acpi default='on' toggle='yes'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <apic default='on' toggle='no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <cpuselection/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <deviceboot/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <disksnapshot default='on' toggle='no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <externalSnapshot/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     </features>
Jan 21 18:04:54 compute-0 nova_compute[182296]:   </guest>
Jan 21 18:04:54 compute-0 nova_compute[182296]: 
Jan 21 18:04:54 compute-0 nova_compute[182296]:   <guest>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     <os_type>hvm</os_type>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     <arch name='x86_64'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <wordsize>64</wordsize>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <domain type='qemu'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <domain type='kvm'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     </arch>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     <features>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <acpi default='on' toggle='yes'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <apic default='on' toggle='no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <cpuselection/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <deviceboot/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <disksnapshot default='on' toggle='no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <externalSnapshot/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     </features>
Jan 21 18:04:54 compute-0 nova_compute[182296]:   </guest>
Jan 21 18:04:54 compute-0 nova_compute[182296]: 
Jan 21 18:04:54 compute-0 nova_compute[182296]: </capabilities>
Jan 21 18:04:54 compute-0 nova_compute[182296]: 
Jan 21 18:04:54 compute-0 nova_compute[182296]: 2026-01-21 18:04:54.918 182300 DEBUG nova.virt.libvirt.host [None req-29ba64e2-34eb-42ca-86e4-9856bcb4b348 - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Jan 21 18:04:54 compute-0 nova_compute[182296]: 2026-01-21 18:04:54.935 182300 DEBUG nova.virt.libvirt.host [None req-29ba64e2-34eb-42ca-86e4-9856bcb4b348 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Jan 21 18:04:54 compute-0 nova_compute[182296]: <domainCapabilities>
Jan 21 18:04:54 compute-0 nova_compute[182296]:   <path>/usr/libexec/qemu-kvm</path>
Jan 21 18:04:54 compute-0 nova_compute[182296]:   <domain>kvm</domain>
Jan 21 18:04:54 compute-0 nova_compute[182296]:   <machine>pc-i440fx-rhel7.6.0</machine>
Jan 21 18:04:54 compute-0 nova_compute[182296]:   <arch>i686</arch>
Jan 21 18:04:54 compute-0 nova_compute[182296]:   <vcpu max='240'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:   <iothreads supported='yes'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:   <os supported='yes'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     <enum name='firmware'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     <loader supported='yes'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <enum name='type'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>rom</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>pflash</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <enum name='readonly'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>yes</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>no</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <enum name='secure'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>no</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     </loader>
Jan 21 18:04:54 compute-0 nova_compute[182296]:   </os>
Jan 21 18:04:54 compute-0 nova_compute[182296]:   <cpu>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     <mode name='host-passthrough' supported='yes'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <enum name='hostPassthroughMigratable'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>on</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>off</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     </mode>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     <mode name='maximum' supported='yes'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <enum name='maximumMigratable'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>on</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>off</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     </mode>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     <mode name='host-model' supported='yes'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <vendor>AMD</vendor>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <feature policy='require' name='x2apic'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <feature policy='require' name='tsc-deadline'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <feature policy='require' name='hypervisor'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <feature policy='require' name='tsc_adjust'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <feature policy='require' name='spec-ctrl'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <feature policy='require' name='stibp'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <feature policy='require' name='ssbd'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <feature policy='require' name='cmp_legacy'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <feature policy='require' name='overflow-recov'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <feature policy='require' name='succor'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <feature policy='require' name='ibrs'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <feature policy='require' name='amd-ssbd'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <feature policy='require' name='virt-ssbd'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <feature policy='require' name='lbrv'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <feature policy='require' name='tsc-scale'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <feature policy='require' name='vmcb-clean'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <feature policy='require' name='flushbyasid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <feature policy='require' name='pause-filter'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <feature policy='require' name='pfthreshold'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <feature policy='require' name='svme-addr-chk'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <feature policy='disable' name='xsaves'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     </mode>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     <mode name='custom' supported='yes'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='Broadwell'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='Broadwell-IBRS'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='Broadwell-noTSX'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='Broadwell-v1'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='Broadwell-v2'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='Broadwell-v3'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='Broadwell-v4'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='Cascadelake-Server'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='Cascadelake-Server-v1'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='Cascadelake-Server-v2'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='Cascadelake-Server-v3'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='Cascadelake-Server-v4'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='Cascadelake-Server-v5'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='ClearwaterForest'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx-ifma'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx-ne-convert'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx-vnni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx-vnni-int16'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx-vnni-int8'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='bhi-ctrl'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='bhi-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='bus-lock-detect'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='cldemote'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='cmpccxadd'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='ddpd-u'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='fbsdp-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='fsrs'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='intel-psfd'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='ipred-ctrl'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='lam'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='mcdt-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='movdir64b'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='movdiri'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pbrsb-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='prefetchiti'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='psdp-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='rrsba-ctrl'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='serialize'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='sha512'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='sm3'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='sm4'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='ss'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='ClearwaterForest-v1'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx-ifma'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx-ne-convert'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx-vnni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx-vnni-int16'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx-vnni-int8'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='bhi-ctrl'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='bhi-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='bus-lock-detect'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='cldemote'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='cmpccxadd'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='ddpd-u'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='fbsdp-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='fsrs'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='intel-psfd'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='ipred-ctrl'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='lam'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='mcdt-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='movdir64b'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='movdiri'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pbrsb-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='prefetchiti'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='psdp-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='rrsba-ctrl'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='serialize'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='sha512'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='sm3'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='sm4'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='ss'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='Cooperlake'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512-bf16'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='taa-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='Cooperlake-v1'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512-bf16'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='taa-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='Cooperlake-v2'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512-bf16'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='taa-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='Denverton'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='mpx'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='Denverton-v1'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='mpx'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='Denverton-v2'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='Denverton-v3'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='Dhyana-v2'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='EPYC-Genoa'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='amd-psfd'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='auto-ibrs'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512-bf16'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512ifma'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='no-nested-data-bp'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='null-sel-clr-base'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='stibp-always-on'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='EPYC-Genoa-v1'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='amd-psfd'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='auto-ibrs'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512-bf16'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512ifma'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='no-nested-data-bp'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='null-sel-clr-base'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='stibp-always-on'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='EPYC-Genoa-v2'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='amd-psfd'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='auto-ibrs'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512-bf16'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512ifma'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='fs-gs-base-ns'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='no-nested-data-bp'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='null-sel-clr-base'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='perfmon-v2'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='stibp-always-on'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='EPYC-Milan'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='EPYC-Milan-v1'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='EPYC-Milan-v2'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='amd-psfd'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='no-nested-data-bp'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='null-sel-clr-base'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='stibp-always-on'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='EPYC-Milan-v3'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='amd-psfd'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='no-nested-data-bp'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='null-sel-clr-base'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='stibp-always-on'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='EPYC-Rome'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='EPYC-Rome-v1'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='EPYC-Rome-v2'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='EPYC-Rome-v3'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='EPYC-Turin'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='amd-psfd'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='auto-ibrs'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx-vnni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512-bf16'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512-vp2intersect'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512ifma'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='fs-gs-base-ns'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='ibpb-brtype'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='movdir64b'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='movdiri'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='no-nested-data-bp'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='null-sel-clr-base'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='perfmon-v2'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='prefetchi'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='sbpb'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='srso-user-kernel-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='stibp-always-on'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='EPYC-Turin-v1'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='amd-psfd'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='auto-ibrs'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx-vnni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512-bf16'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512-vp2intersect'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512ifma'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='fs-gs-base-ns'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='ibpb-brtype'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='movdir64b'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='movdiri'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='no-nested-data-bp'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='null-sel-clr-base'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='perfmon-v2'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='prefetchi'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='sbpb'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='srso-user-kernel-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='stibp-always-on'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='EPYC-v3'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='EPYC-v4'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='EPYC-v5'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='GraniteRapids'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='amx-bf16'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='amx-fp16'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='amx-int8'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='amx-tile'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx-vnni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512-bf16'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512-fp16'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512ifma'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='bus-lock-detect'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='fbsdp-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='fsrc'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='fsrs'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='fzrm'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='mcdt-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pbrsb-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='prefetchiti'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='psdp-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='serialize'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='taa-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='tsx-ldtrk'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='xfd'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='GraniteRapids-v1'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='amx-bf16'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='amx-fp16'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='amx-int8'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='amx-tile'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx-vnni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512-bf16'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512-fp16'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512ifma'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='bus-lock-detect'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='fbsdp-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='fsrc'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='fsrs'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='fzrm'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='mcdt-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pbrsb-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='prefetchiti'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='psdp-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='serialize'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='taa-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='tsx-ldtrk'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='xfd'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='GraniteRapids-v2'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='amx-bf16'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='amx-fp16'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='amx-int8'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='amx-tile'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx-vnni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx10'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx10-128'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx10-256'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx10-512'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512-bf16'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512-fp16'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512ifma'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='bus-lock-detect'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='cldemote'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='fbsdp-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='fsrc'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='fsrs'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='fzrm'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='mcdt-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='movdir64b'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='movdiri'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pbrsb-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='prefetchiti'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='psdp-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='serialize'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='ss'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='taa-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='tsx-ldtrk'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='xfd'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='GraniteRapids-v3'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='amx-bf16'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='amx-fp16'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='amx-int8'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='amx-tile'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx-vnni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx10'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx10-128'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx10-256'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx10-512'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512-bf16'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512-fp16'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512ifma'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='bus-lock-detect'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='cldemote'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='fbsdp-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='fsrc'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='fsrs'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='fzrm'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='mcdt-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='movdir64b'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='movdiri'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pbrsb-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='prefetchiti'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='psdp-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='serialize'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='ss'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='taa-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='tsx-ldtrk'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='xfd'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='Haswell'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='Haswell-IBRS'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='Haswell-noTSX'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='Haswell-v1'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='Haswell-v2'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='Haswell-v3'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='Haswell-v4'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='Icelake-Server'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='Icelake-Server-noTSX'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='Icelake-Server-v1'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='Icelake-Server-v2'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='Icelake-Server-v3'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='taa-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='Icelake-Server-v4'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512ifma'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='taa-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='Icelake-Server-v5'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512ifma'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='taa-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='Icelake-Server-v6'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512ifma'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='taa-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='Icelake-Server-v7'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512ifma'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='taa-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='IvyBridge'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='IvyBridge-IBRS'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='IvyBridge-v1'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='IvyBridge-v2'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='KnightsMill'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512-4fmaps'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512-4vnniw'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512er'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512pf'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='ss'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='KnightsMill-v1'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512-4fmaps'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512-4vnniw'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512er'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512pf'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='ss'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='Opteron_G4'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='fma4'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='xop'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='Opteron_G4-v1'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='fma4'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='xop'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='Opteron_G5'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='fma4'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='tbm'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='xop'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='Opteron_G5-v1'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='fma4'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='tbm'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='xop'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='SapphireRapids'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='amx-bf16'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='amx-int8'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='amx-tile'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx-vnni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512-bf16'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512-fp16'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512ifma'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='bus-lock-detect'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='fsrc'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='fsrs'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='fzrm'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='serialize'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='taa-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='tsx-ldtrk'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='xfd'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='SapphireRapids-v1'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='amx-bf16'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='amx-int8'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='amx-tile'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx-vnni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512-bf16'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512-fp16'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512ifma'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='bus-lock-detect'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='fsrc'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='fsrs'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='fzrm'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='serialize'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='taa-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='tsx-ldtrk'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='xfd'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='SapphireRapids-v2'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='amx-bf16'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='amx-int8'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='amx-tile'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx-vnni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512-bf16'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512-fp16'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512ifma'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='bus-lock-detect'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='fbsdp-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='fsrc'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='fsrs'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='fzrm'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='psdp-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='serialize'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='taa-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='tsx-ldtrk'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='xfd'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='SapphireRapids-v3'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='amx-bf16'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='amx-int8'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='amx-tile'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx-vnni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512-bf16'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512-fp16'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512ifma'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='bus-lock-detect'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='cldemote'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='fbsdp-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='fsrc'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='fsrs'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='fzrm'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='movdir64b'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='movdiri'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='psdp-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='serialize'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='ss'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='taa-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='tsx-ldtrk'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='xfd'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='SapphireRapids-v4'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='amx-bf16'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='amx-int8'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='amx-tile'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx-vnni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512-bf16'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512-fp16'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512ifma'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='bus-lock-detect'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='cldemote'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='fbsdp-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='fsrc'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='fsrs'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='fzrm'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='movdir64b'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='movdiri'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='psdp-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='serialize'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='ss'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='taa-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='tsx-ldtrk'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='xfd'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='SierraForest'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx-ifma'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx-ne-convert'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx-vnni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx-vnni-int8'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='bus-lock-detect'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='cmpccxadd'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='fbsdp-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='fsrs'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='mcdt-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pbrsb-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='psdp-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='serialize'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='SierraForest-v1'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx-ifma'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx-ne-convert'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx-vnni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx-vnni-int8'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='bus-lock-detect'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='cmpccxadd'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='fbsdp-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='fsrs'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='mcdt-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pbrsb-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='psdp-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='serialize'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='SierraForest-v2'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx-ifma'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx-ne-convert'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx-vnni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx-vnni-int8'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='bhi-ctrl'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='bus-lock-detect'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='cldemote'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='cmpccxadd'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='fbsdp-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='fsrs'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='intel-psfd'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='ipred-ctrl'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='lam'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='mcdt-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='movdir64b'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='movdiri'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pbrsb-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='psdp-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='rrsba-ctrl'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='serialize'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='ss'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='SierraForest-v3'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx-ifma'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx-ne-convert'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx-vnni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx-vnni-int8'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='bhi-ctrl'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='bus-lock-detect'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='cldemote'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='cmpccxadd'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='fbsdp-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='fsrs'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='intel-psfd'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='ipred-ctrl'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='lam'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='mcdt-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='movdir64b'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='movdiri'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pbrsb-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='psdp-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='rrsba-ctrl'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='serialize'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='ss'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='Skylake-Client'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='Skylake-Client-IBRS'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='Skylake-Client-v1'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='Skylake-Client-v2'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='Skylake-Client-v3'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='Skylake-Client-v4'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='Skylake-Server'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='Skylake-Server-IBRS'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='Skylake-Server-v1'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='Skylake-Server-v2'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='Skylake-Server-v3'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='Skylake-Server-v4'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='Skylake-Server-v5'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='Snowridge'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='cldemote'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='core-capability'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='movdir64b'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='movdiri'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='mpx'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='split-lock-detect'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='Snowridge-v1'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='cldemote'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='core-capability'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='movdir64b'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='movdiri'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='mpx'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='split-lock-detect'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='Snowridge-v2'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='cldemote'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='core-capability'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='movdir64b'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='movdiri'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='split-lock-detect'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='Snowridge-v3'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='cldemote'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='core-capability'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='movdir64b'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='movdiri'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='split-lock-detect'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='Snowridge-v4'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='cldemote'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='movdir64b'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='movdiri'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='athlon'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='3dnow'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='3dnowext'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='athlon-v1'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='3dnow'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='3dnowext'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='core2duo'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='ss'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='core2duo-v1'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='ss'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='coreduo'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='ss'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='coreduo-v1'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='ss'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='n270'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='ss'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='n270-v1'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='ss'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='phenom'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='3dnow'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='3dnowext'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='phenom-v1'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='3dnow'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='3dnowext'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     </mode>
Jan 21 18:04:54 compute-0 nova_compute[182296]:   </cpu>
Jan 21 18:04:54 compute-0 nova_compute[182296]:   <memoryBacking supported='yes'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     <enum name='sourceType'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <value>file</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <value>anonymous</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <value>memfd</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     </enum>
Jan 21 18:04:54 compute-0 nova_compute[182296]:   </memoryBacking>
Jan 21 18:04:54 compute-0 nova_compute[182296]:   <devices>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     <disk supported='yes'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <enum name='diskDevice'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>disk</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>cdrom</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>floppy</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>lun</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <enum name='bus'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>ide</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>fdc</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>scsi</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>virtio</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>usb</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>sata</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <enum name='model'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>virtio</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>virtio-transitional</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>virtio-non-transitional</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     </disk>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     <graphics supported='yes'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <enum name='type'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>vnc</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>egl-headless</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>dbus</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     </graphics>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     <video supported='yes'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <enum name='modelType'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>vga</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>cirrus</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>virtio</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>none</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>bochs</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>ramfb</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     </video>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     <hostdev supported='yes'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <enum name='mode'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>subsystem</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <enum name='startupPolicy'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>default</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>mandatory</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>requisite</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>optional</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <enum name='subsysType'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>usb</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>pci</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>scsi</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <enum name='capsType'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <enum name='pciBackend'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     </hostdev>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     <rng supported='yes'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <enum name='model'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>virtio</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>virtio-transitional</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>virtio-non-transitional</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <enum name='backendModel'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>random</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>egd</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>builtin</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     </rng>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     <filesystem supported='yes'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <enum name='driverType'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>path</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>handle</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>virtiofs</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     </filesystem>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     <tpm supported='yes'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <enum name='model'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>tpm-tis</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>tpm-crb</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <enum name='backendModel'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>emulator</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>external</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <enum name='backendVersion'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>2.0</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     </tpm>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     <redirdev supported='yes'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <enum name='bus'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>usb</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     </redirdev>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     <channel supported='yes'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <enum name='type'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>pty</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>unix</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     </channel>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     <crypto supported='yes'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <enum name='model'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <enum name='type'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>qemu</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <enum name='backendModel'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>builtin</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     </crypto>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     <interface supported='yes'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <enum name='backendType'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>default</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>passt</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     </interface>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     <panic supported='yes'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <enum name='model'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>isa</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>hyperv</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     </panic>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     <console supported='yes'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <enum name='type'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>null</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>vc</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>pty</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>dev</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>file</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>pipe</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>stdio</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>udp</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>tcp</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>unix</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>qemu-vdagent</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>dbus</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     </console>
Jan 21 18:04:54 compute-0 nova_compute[182296]:   </devices>
Jan 21 18:04:54 compute-0 nova_compute[182296]:   <features>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     <gic supported='no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     <vmcoreinfo supported='yes'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     <genid supported='yes'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     <backingStoreInput supported='yes'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     <backup supported='yes'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     <async-teardown supported='yes'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     <s390-pv supported='no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     <ps2 supported='yes'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     <tdx supported='no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     <sev supported='no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     <sgx supported='no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     <hyperv supported='yes'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <enum name='features'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>relaxed</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>vapic</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>spinlocks</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>vpindex</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>runtime</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>synic</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>stimer</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>reset</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>vendor_id</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>frequencies</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>reenlightenment</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>tlbflush</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>ipi</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>avic</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>emsr_bitmap</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>xmm_input</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <defaults>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <spinlocks>4095</spinlocks>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <stimer_direct>on</stimer_direct>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <tlbflush_direct>on</tlbflush_direct>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <tlbflush_extended>on</tlbflush_extended>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </defaults>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     </hyperv>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     <launchSecurity supported='no'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:   </features>
Jan 21 18:04:54 compute-0 nova_compute[182296]: </domainCapabilities>
Jan 21 18:04:54 compute-0 nova_compute[182296]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 21 18:04:54 compute-0 nova_compute[182296]: 2026-01-21 18:04:54.941 182300 DEBUG nova.virt.libvirt.host [None req-29ba64e2-34eb-42ca-86e4-9856bcb4b348 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Jan 21 18:04:54 compute-0 nova_compute[182296]: <domainCapabilities>
Jan 21 18:04:54 compute-0 nova_compute[182296]:   <path>/usr/libexec/qemu-kvm</path>
Jan 21 18:04:54 compute-0 nova_compute[182296]:   <domain>kvm</domain>
Jan 21 18:04:54 compute-0 nova_compute[182296]:   <machine>pc-q35-rhel9.8.0</machine>
Jan 21 18:04:54 compute-0 nova_compute[182296]:   <arch>i686</arch>
Jan 21 18:04:54 compute-0 nova_compute[182296]:   <vcpu max='4096'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:   <iothreads supported='yes'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:   <os supported='yes'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     <enum name='firmware'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     <loader supported='yes'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <enum name='type'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>rom</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>pflash</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <enum name='readonly'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>yes</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>no</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <enum name='secure'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>no</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     </loader>
Jan 21 18:04:54 compute-0 nova_compute[182296]:   </os>
Jan 21 18:04:54 compute-0 nova_compute[182296]:   <cpu>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     <mode name='host-passthrough' supported='yes'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <enum name='hostPassthroughMigratable'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>on</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>off</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     </mode>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     <mode name='maximum' supported='yes'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <enum name='maximumMigratable'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>on</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <value>off</value>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     </mode>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     <mode name='host-model' supported='yes'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <vendor>AMD</vendor>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <feature policy='require' name='x2apic'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <feature policy='require' name='tsc-deadline'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <feature policy='require' name='hypervisor'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <feature policy='require' name='tsc_adjust'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <feature policy='require' name='spec-ctrl'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <feature policy='require' name='stibp'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <feature policy='require' name='ssbd'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <feature policy='require' name='cmp_legacy'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <feature policy='require' name='overflow-recov'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <feature policy='require' name='succor'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <feature policy='require' name='ibrs'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <feature policy='require' name='amd-ssbd'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <feature policy='require' name='virt-ssbd'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <feature policy='require' name='lbrv'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <feature policy='require' name='tsc-scale'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <feature policy='require' name='vmcb-clean'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <feature policy='require' name='flushbyasid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <feature policy='require' name='pause-filter'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <feature policy='require' name='pfthreshold'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <feature policy='require' name='svme-addr-chk'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <feature policy='disable' name='xsaves'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     </mode>
Jan 21 18:04:54 compute-0 nova_compute[182296]:     <mode name='custom' supported='yes'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='Broadwell'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='Broadwell-IBRS'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='Broadwell-noTSX'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 21 18:04:54 compute-0 nova_compute[182296]:       <blockers model='Broadwell-v1'>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:54 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Broadwell-v2'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Broadwell-v3'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Broadwell-v4'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Cascadelake-Server'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Cascadelake-Server-v1'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Cascadelake-Server-v2'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Cascadelake-Server-v3'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Cascadelake-Server-v4'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Cascadelake-Server-v5'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='ClearwaterForest'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-ifma'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-ne-convert'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-vnni-int16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-vnni-int8'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='bhi-ctrl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='bhi-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='bus-lock-detect'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='cldemote'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='cmpccxadd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ddpd-u'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fbsdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrs'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='intel-psfd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ipred-ctrl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='lam'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='mcdt-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdir64b'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdiri'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pbrsb-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='prefetchiti'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='psdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rrsba-ctrl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='serialize'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='sha512'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='sm3'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='sm4'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ss'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='ClearwaterForest-v1'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-ifma'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-ne-convert'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-vnni-int16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-vnni-int8'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='bhi-ctrl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='bhi-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='bus-lock-detect'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='cldemote'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='cmpccxadd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ddpd-u'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fbsdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrs'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='intel-psfd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ipred-ctrl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='lam'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='mcdt-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdir64b'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdiri'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pbrsb-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='prefetchiti'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='psdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rrsba-ctrl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='serialize'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='sha512'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='sm3'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='sm4'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ss'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Cooperlake'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-bf16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='taa-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Cooperlake-v1'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-bf16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='taa-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Cooperlake-v2'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-bf16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='taa-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Denverton'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='mpx'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Denverton-v1'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='mpx'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Denverton-v2'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Denverton-v3'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Dhyana-v2'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='EPYC-Genoa'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amd-psfd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='auto-ibrs'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-bf16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512ifma'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='no-nested-data-bp'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='null-sel-clr-base'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='stibp-always-on'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='EPYC-Genoa-v1'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amd-psfd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='auto-ibrs'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-bf16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512ifma'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='no-nested-data-bp'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='null-sel-clr-base'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='stibp-always-on'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='EPYC-Genoa-v2'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amd-psfd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='auto-ibrs'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-bf16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512ifma'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fs-gs-base-ns'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='no-nested-data-bp'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='null-sel-clr-base'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='perfmon-v2'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='stibp-always-on'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='EPYC-Milan'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='EPYC-Milan-v1'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='EPYC-Milan-v2'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amd-psfd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='no-nested-data-bp'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='null-sel-clr-base'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='stibp-always-on'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='EPYC-Milan-v3'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amd-psfd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='no-nested-data-bp'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='null-sel-clr-base'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='stibp-always-on'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='EPYC-Rome'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='EPYC-Rome-v1'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='EPYC-Rome-v2'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='EPYC-Rome-v3'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='EPYC-Turin'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amd-psfd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='auto-ibrs'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-bf16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-vp2intersect'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512ifma'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fs-gs-base-ns'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibpb-brtype'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdir64b'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdiri'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='no-nested-data-bp'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='null-sel-clr-base'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='perfmon-v2'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='prefetchi'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='sbpb'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='srso-user-kernel-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='stibp-always-on'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='EPYC-Turin-v1'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amd-psfd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='auto-ibrs'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-bf16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-vp2intersect'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512ifma'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fs-gs-base-ns'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibpb-brtype'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdir64b'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdiri'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='no-nested-data-bp'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='null-sel-clr-base'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='perfmon-v2'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='prefetchi'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='sbpb'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='srso-user-kernel-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='stibp-always-on'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='EPYC-v3'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='EPYC-v4'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='EPYC-v5'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='GraniteRapids'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-bf16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-fp16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-int8'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-tile'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-bf16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-fp16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512ifma'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='bus-lock-detect'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fbsdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrc'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrs'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fzrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='mcdt-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pbrsb-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='prefetchiti'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='psdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='serialize'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='taa-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='tsx-ldtrk'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xfd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='GraniteRapids-v1'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-bf16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-fp16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-int8'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-tile'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-bf16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-fp16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512ifma'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='bus-lock-detect'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fbsdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrc'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrs'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fzrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='mcdt-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pbrsb-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='prefetchiti'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='psdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='serialize'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='taa-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='tsx-ldtrk'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xfd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='GraniteRapids-v2'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-bf16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-fp16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-int8'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-tile'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx10'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx10-128'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx10-256'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx10-512'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-bf16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-fp16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512ifma'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='bus-lock-detect'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='cldemote'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fbsdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrc'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrs'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fzrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='mcdt-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdir64b'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdiri'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pbrsb-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='prefetchiti'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='psdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='serialize'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ss'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='taa-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='tsx-ldtrk'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xfd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='GraniteRapids-v3'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-bf16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-fp16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-int8'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-tile'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx10'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx10-128'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx10-256'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx10-512'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-bf16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-fp16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512ifma'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='bus-lock-detect'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='cldemote'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fbsdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrc'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrs'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fzrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='mcdt-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdir64b'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdiri'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pbrsb-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='prefetchiti'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='psdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='serialize'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ss'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='taa-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='tsx-ldtrk'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xfd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Haswell'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Haswell-IBRS'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Haswell-noTSX'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Haswell-v1'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Haswell-v2'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Haswell-v3'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Haswell-v4'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Icelake-Server'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Icelake-Server-noTSX'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Icelake-Server-v1'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Icelake-Server-v2'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Icelake-Server-v3'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='taa-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Icelake-Server-v4'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512ifma'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='taa-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Icelake-Server-v5'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512ifma'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='taa-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Icelake-Server-v6'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512ifma'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='taa-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Icelake-Server-v7'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512ifma'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='taa-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='IvyBridge'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='IvyBridge-IBRS'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='IvyBridge-v1'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='IvyBridge-v2'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='KnightsMill'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-4fmaps'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-4vnniw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512er'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512pf'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ss'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='KnightsMill-v1'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-4fmaps'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-4vnniw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512er'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512pf'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ss'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Opteron_G4'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fma4'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xop'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Opteron_G4-v1'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fma4'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xop'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Opteron_G5'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fma4'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='tbm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xop'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Opteron_G5-v1'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fma4'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='tbm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xop'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='SapphireRapids'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-bf16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-int8'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-tile'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-bf16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-fp16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512ifma'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='bus-lock-detect'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrc'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrs'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fzrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='serialize'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='taa-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='tsx-ldtrk'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xfd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='SapphireRapids-v1'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-bf16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-int8'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-tile'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-bf16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-fp16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512ifma'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='bus-lock-detect'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrc'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrs'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fzrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='serialize'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='taa-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='tsx-ldtrk'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xfd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='SapphireRapids-v2'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-bf16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-int8'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-tile'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-bf16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-fp16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512ifma'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='bus-lock-detect'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fbsdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrc'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrs'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fzrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='psdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='serialize'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='taa-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='tsx-ldtrk'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xfd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='SapphireRapids-v3'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-bf16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-int8'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-tile'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-bf16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-fp16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512ifma'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='bus-lock-detect'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='cldemote'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fbsdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrc'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrs'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fzrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdir64b'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdiri'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='psdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='serialize'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ss'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='taa-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='tsx-ldtrk'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xfd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='SapphireRapids-v4'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-bf16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-int8'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-tile'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-bf16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-fp16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512ifma'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='bus-lock-detect'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='cldemote'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fbsdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrc'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrs'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fzrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdir64b'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdiri'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='psdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='serialize'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ss'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='taa-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='tsx-ldtrk'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xfd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='SierraForest'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-ifma'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-ne-convert'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-vnni-int8'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='bus-lock-detect'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='cmpccxadd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fbsdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrs'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='mcdt-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pbrsb-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='psdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='serialize'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='SierraForest-v1'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-ifma'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-ne-convert'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-vnni-int8'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='bus-lock-detect'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='cmpccxadd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fbsdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrs'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='mcdt-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pbrsb-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='psdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='serialize'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='SierraForest-v2'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-ifma'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-ne-convert'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-vnni-int8'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='bhi-ctrl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='bus-lock-detect'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='cldemote'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='cmpccxadd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fbsdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrs'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='intel-psfd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ipred-ctrl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='lam'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='mcdt-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdir64b'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdiri'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pbrsb-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='psdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rrsba-ctrl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='serialize'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ss'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='SierraForest-v3'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-ifma'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-ne-convert'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-vnni-int8'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='bhi-ctrl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='bus-lock-detect'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='cldemote'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='cmpccxadd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fbsdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrs'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='intel-psfd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ipred-ctrl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='lam'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='mcdt-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdir64b'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdiri'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pbrsb-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='psdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rrsba-ctrl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='serialize'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ss'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Skylake-Client'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Skylake-Client-IBRS'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Skylake-Client-v1'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Skylake-Client-v2'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Skylake-Client-v3'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Skylake-Client-v4'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Skylake-Server'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Skylake-Server-IBRS'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Skylake-Server-v1'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Skylake-Server-v2'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Skylake-Server-v3'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Skylake-Server-v4'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Skylake-Server-v5'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Snowridge'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='cldemote'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='core-capability'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdir64b'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdiri'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='mpx'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='split-lock-detect'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Snowridge-v1'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='cldemote'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='core-capability'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdir64b'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdiri'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='mpx'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='split-lock-detect'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Snowridge-v2'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='cldemote'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='core-capability'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdir64b'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdiri'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='split-lock-detect'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Snowridge-v3'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='cldemote'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='core-capability'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdir64b'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdiri'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='split-lock-detect'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Snowridge-v4'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='cldemote'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdir64b'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdiri'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='athlon'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='3dnow'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='3dnowext'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='athlon-v1'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='3dnow'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='3dnowext'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='core2duo'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ss'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='core2duo-v1'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ss'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='coreduo'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ss'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='coreduo-v1'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ss'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='n270'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ss'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='n270-v1'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ss'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='phenom'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='3dnow'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='3dnowext'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='phenom-v1'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='3dnow'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='3dnowext'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     </mode>
Jan 21 18:04:55 compute-0 nova_compute[182296]:   </cpu>
Jan 21 18:04:55 compute-0 nova_compute[182296]:   <memoryBacking supported='yes'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <enum name='sourceType'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <value>file</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <value>anonymous</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <value>memfd</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     </enum>
Jan 21 18:04:55 compute-0 nova_compute[182296]:   </memoryBacking>
Jan 21 18:04:55 compute-0 nova_compute[182296]:   <devices>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <disk supported='yes'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='diskDevice'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>disk</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>cdrom</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>floppy</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>lun</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='bus'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>fdc</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>scsi</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>virtio</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>usb</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>sata</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='model'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>virtio</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>virtio-transitional</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>virtio-non-transitional</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     </disk>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <graphics supported='yes'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='type'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>vnc</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>egl-headless</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>dbus</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     </graphics>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <video supported='yes'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='modelType'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>vga</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>cirrus</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>virtio</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>none</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>bochs</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>ramfb</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     </video>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <hostdev supported='yes'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='mode'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>subsystem</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='startupPolicy'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>default</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>mandatory</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>requisite</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>optional</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='subsysType'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>usb</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>pci</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>scsi</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='capsType'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='pciBackend'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     </hostdev>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <rng supported='yes'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='model'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>virtio</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>virtio-transitional</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>virtio-non-transitional</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='backendModel'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>random</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>egd</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>builtin</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     </rng>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <filesystem supported='yes'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='driverType'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>path</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>handle</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>virtiofs</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     </filesystem>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <tpm supported='yes'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='model'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>tpm-tis</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>tpm-crb</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='backendModel'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>emulator</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>external</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='backendVersion'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>2.0</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     </tpm>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <redirdev supported='yes'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='bus'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>usb</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     </redirdev>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <channel supported='yes'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='type'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>pty</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>unix</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     </channel>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <crypto supported='yes'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='model'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='type'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>qemu</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='backendModel'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>builtin</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     </crypto>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <interface supported='yes'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='backendType'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>default</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>passt</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     </interface>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <panic supported='yes'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='model'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>isa</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>hyperv</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     </panic>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <console supported='yes'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='type'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>null</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>vc</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>pty</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>dev</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>file</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>pipe</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>stdio</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>udp</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>tcp</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>unix</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>qemu-vdagent</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>dbus</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     </console>
Jan 21 18:04:55 compute-0 nova_compute[182296]:   </devices>
Jan 21 18:04:55 compute-0 nova_compute[182296]:   <features>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <gic supported='no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <vmcoreinfo supported='yes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <genid supported='yes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <backingStoreInput supported='yes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <backup supported='yes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <async-teardown supported='yes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <s390-pv supported='no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <ps2 supported='yes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <tdx supported='no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <sev supported='no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <sgx supported='no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <hyperv supported='yes'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='features'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>relaxed</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>vapic</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>spinlocks</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>vpindex</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>runtime</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>synic</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>stimer</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>reset</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>vendor_id</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>frequencies</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>reenlightenment</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>tlbflush</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>ipi</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>avic</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>emsr_bitmap</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>xmm_input</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <defaults>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <spinlocks>4095</spinlocks>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <stimer_direct>on</stimer_direct>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <tlbflush_direct>on</tlbflush_direct>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <tlbflush_extended>on</tlbflush_extended>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </defaults>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     </hyperv>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <launchSecurity supported='no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:   </features>
Jan 21 18:04:55 compute-0 nova_compute[182296]: </domainCapabilities>
Jan 21 18:04:55 compute-0 nova_compute[182296]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 21 18:04:55 compute-0 nova_compute[182296]: 2026-01-21 18:04:54.999 182300 DEBUG nova.virt.libvirt.host [None req-29ba64e2-34eb-42ca-86e4-9856bcb4b348 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Jan 21 18:04:55 compute-0 nova_compute[182296]: 2026-01-21 18:04:55.003 182300 DEBUG nova.virt.libvirt.host [None req-29ba64e2-34eb-42ca-86e4-9856bcb4b348 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Jan 21 18:04:55 compute-0 nova_compute[182296]: <domainCapabilities>
Jan 21 18:04:55 compute-0 nova_compute[182296]:   <path>/usr/libexec/qemu-kvm</path>
Jan 21 18:04:55 compute-0 nova_compute[182296]:   <domain>kvm</domain>
Jan 21 18:04:55 compute-0 nova_compute[182296]:   <machine>pc-i440fx-rhel7.6.0</machine>
Jan 21 18:04:55 compute-0 nova_compute[182296]:   <arch>x86_64</arch>
Jan 21 18:04:55 compute-0 nova_compute[182296]:   <vcpu max='240'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:   <iothreads supported='yes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:   <os supported='yes'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <enum name='firmware'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <loader supported='yes'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='type'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>rom</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>pflash</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='readonly'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>yes</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>no</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='secure'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>no</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     </loader>
Jan 21 18:04:55 compute-0 nova_compute[182296]:   </os>
Jan 21 18:04:55 compute-0 nova_compute[182296]:   <cpu>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <mode name='host-passthrough' supported='yes'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='hostPassthroughMigratable'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>on</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>off</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     </mode>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <mode name='maximum' supported='yes'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='maximumMigratable'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>on</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>off</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     </mode>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <mode name='host-model' supported='yes'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <vendor>AMD</vendor>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <feature policy='require' name='x2apic'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <feature policy='require' name='tsc-deadline'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <feature policy='require' name='hypervisor'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <feature policy='require' name='tsc_adjust'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <feature policy='require' name='spec-ctrl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <feature policy='require' name='stibp'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <feature policy='require' name='ssbd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <feature policy='require' name='cmp_legacy'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <feature policy='require' name='overflow-recov'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <feature policy='require' name='succor'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <feature policy='require' name='ibrs'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <feature policy='require' name='amd-ssbd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <feature policy='require' name='virt-ssbd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <feature policy='require' name='lbrv'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <feature policy='require' name='tsc-scale'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <feature policy='require' name='vmcb-clean'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <feature policy='require' name='flushbyasid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <feature policy='require' name='pause-filter'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <feature policy='require' name='pfthreshold'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <feature policy='require' name='svme-addr-chk'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <feature policy='disable' name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     </mode>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <mode name='custom' supported='yes'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Broadwell'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Broadwell-IBRS'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Broadwell-noTSX'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Broadwell-v1'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Broadwell-v2'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Broadwell-v3'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Broadwell-v4'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Cascadelake-Server'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Cascadelake-Server-v1'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Cascadelake-Server-v2'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Cascadelake-Server-v3'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Cascadelake-Server-v4'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Cascadelake-Server-v5'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='ClearwaterForest'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-ifma'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-ne-convert'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-vnni-int16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-vnni-int8'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='bhi-ctrl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='bhi-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='bus-lock-detect'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='cldemote'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='cmpccxadd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ddpd-u'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fbsdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrs'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='intel-psfd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ipred-ctrl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='lam'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='mcdt-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdir64b'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdiri'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pbrsb-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='prefetchiti'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='psdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rrsba-ctrl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='serialize'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='sha512'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='sm3'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='sm4'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ss'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='ClearwaterForest-v1'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-ifma'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-ne-convert'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-vnni-int16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-vnni-int8'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='bhi-ctrl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='bhi-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='bus-lock-detect'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='cldemote'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='cmpccxadd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ddpd-u'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fbsdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrs'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='intel-psfd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ipred-ctrl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='lam'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='mcdt-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdir64b'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdiri'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pbrsb-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='prefetchiti'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='psdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rrsba-ctrl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='serialize'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='sha512'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='sm3'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='sm4'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ss'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Cooperlake'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-bf16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='taa-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Cooperlake-v1'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-bf16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='taa-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Cooperlake-v2'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-bf16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='taa-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Denverton'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='mpx'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Denverton-v1'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='mpx'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Denverton-v2'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Denverton-v3'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Dhyana-v2'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='EPYC-Genoa'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amd-psfd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='auto-ibrs'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-bf16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512ifma'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='no-nested-data-bp'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='null-sel-clr-base'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='stibp-always-on'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='EPYC-Genoa-v1'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amd-psfd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='auto-ibrs'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-bf16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512ifma'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='no-nested-data-bp'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='null-sel-clr-base'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='stibp-always-on'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='EPYC-Genoa-v2'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amd-psfd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='auto-ibrs'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-bf16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512ifma'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fs-gs-base-ns'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='no-nested-data-bp'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='null-sel-clr-base'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='perfmon-v2'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='stibp-always-on'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='EPYC-Milan'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='EPYC-Milan-v1'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='EPYC-Milan-v2'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amd-psfd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='no-nested-data-bp'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='null-sel-clr-base'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='stibp-always-on'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='EPYC-Milan-v3'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amd-psfd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='no-nested-data-bp'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='null-sel-clr-base'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='stibp-always-on'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='EPYC-Rome'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='EPYC-Rome-v1'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='EPYC-Rome-v2'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='EPYC-Rome-v3'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='EPYC-Turin'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amd-psfd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='auto-ibrs'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-bf16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-vp2intersect'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512ifma'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fs-gs-base-ns'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibpb-brtype'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdir64b'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdiri'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='no-nested-data-bp'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='null-sel-clr-base'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='perfmon-v2'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='prefetchi'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='sbpb'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='srso-user-kernel-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='stibp-always-on'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='EPYC-Turin-v1'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amd-psfd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='auto-ibrs'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-bf16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-vp2intersect'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512ifma'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fs-gs-base-ns'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibpb-brtype'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdir64b'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdiri'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='no-nested-data-bp'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='null-sel-clr-base'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='perfmon-v2'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='prefetchi'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='sbpb'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='srso-user-kernel-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='stibp-always-on'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='EPYC-v3'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='EPYC-v4'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='EPYC-v5'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='GraniteRapids'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-bf16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-fp16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-int8'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-tile'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-bf16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-fp16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512ifma'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='bus-lock-detect'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fbsdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrc'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrs'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fzrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='mcdt-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pbrsb-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='prefetchiti'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='psdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='serialize'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='taa-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='tsx-ldtrk'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xfd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='GraniteRapids-v1'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-bf16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-fp16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-int8'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-tile'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-bf16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-fp16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512ifma'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='bus-lock-detect'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fbsdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrc'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrs'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fzrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='mcdt-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pbrsb-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='prefetchiti'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='psdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='serialize'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='taa-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='tsx-ldtrk'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xfd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='GraniteRapids-v2'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-bf16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-fp16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-int8'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-tile'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx10'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx10-128'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx10-256'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx10-512'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-bf16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-fp16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512ifma'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='bus-lock-detect'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='cldemote'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fbsdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrc'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrs'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fzrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='mcdt-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdir64b'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdiri'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pbrsb-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='prefetchiti'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='psdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='serialize'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ss'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='taa-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='tsx-ldtrk'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xfd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='GraniteRapids-v3'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-bf16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-fp16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-int8'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-tile'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx10'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx10-128'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx10-256'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx10-512'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-bf16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-fp16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512ifma'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='bus-lock-detect'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='cldemote'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fbsdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrc'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrs'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fzrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='mcdt-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdir64b'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdiri'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pbrsb-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='prefetchiti'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='psdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='serialize'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ss'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='taa-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='tsx-ldtrk'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xfd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Haswell'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Haswell-IBRS'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Haswell-noTSX'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Haswell-v1'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Haswell-v2'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Haswell-v3'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Haswell-v4'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Icelake-Server'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Icelake-Server-noTSX'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Icelake-Server-v1'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Icelake-Server-v2'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Icelake-Server-v3'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='taa-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Icelake-Server-v4'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512ifma'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='taa-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Icelake-Server-v5'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512ifma'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='taa-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Icelake-Server-v6'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512ifma'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='taa-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Icelake-Server-v7'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512ifma'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='taa-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='IvyBridge'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='IvyBridge-IBRS'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='IvyBridge-v1'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='IvyBridge-v2'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='KnightsMill'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-4fmaps'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-4vnniw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512er'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512pf'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ss'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='KnightsMill-v1'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-4fmaps'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-4vnniw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512er'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512pf'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ss'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Opteron_G4'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fma4'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xop'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Opteron_G4-v1'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fma4'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xop'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Opteron_G5'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fma4'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='tbm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xop'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Opteron_G5-v1'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fma4'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='tbm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xop'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='SapphireRapids'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-bf16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-int8'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-tile'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-bf16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-fp16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512ifma'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='bus-lock-detect'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrc'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrs'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fzrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='serialize'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='taa-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='tsx-ldtrk'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xfd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='SapphireRapids-v1'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-bf16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-int8'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-tile'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-bf16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-fp16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512ifma'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='bus-lock-detect'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrc'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrs'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fzrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='serialize'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='taa-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='tsx-ldtrk'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xfd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='SapphireRapids-v2'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-bf16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-int8'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-tile'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-bf16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-fp16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512ifma'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='bus-lock-detect'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fbsdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrc'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrs'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fzrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='psdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='serialize'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='taa-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='tsx-ldtrk'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xfd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='SapphireRapids-v3'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-bf16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-int8'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-tile'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-bf16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-fp16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512ifma'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='bus-lock-detect'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='cldemote'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fbsdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrc'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrs'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fzrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdir64b'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdiri'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='psdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='serialize'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ss'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='taa-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='tsx-ldtrk'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xfd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='SapphireRapids-v4'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-bf16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-int8'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-tile'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-bf16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-fp16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512ifma'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='bus-lock-detect'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='cldemote'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fbsdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrc'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrs'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fzrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdir64b'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdiri'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='psdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='serialize'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ss'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='taa-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='tsx-ldtrk'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xfd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='SierraForest'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-ifma'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-ne-convert'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-vnni-int8'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='bus-lock-detect'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='cmpccxadd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fbsdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrs'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='mcdt-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pbrsb-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='psdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='serialize'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='SierraForest-v1'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-ifma'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-ne-convert'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-vnni-int8'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='bus-lock-detect'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='cmpccxadd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fbsdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrs'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='mcdt-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pbrsb-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='psdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='serialize'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='SierraForest-v2'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-ifma'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-ne-convert'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-vnni-int8'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='bhi-ctrl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='bus-lock-detect'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='cldemote'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='cmpccxadd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fbsdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrs'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='intel-psfd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ipred-ctrl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='lam'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='mcdt-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdir64b'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdiri'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pbrsb-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='psdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rrsba-ctrl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='serialize'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ss'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='SierraForest-v3'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-ifma'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-ne-convert'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-vnni-int8'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='bhi-ctrl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='bus-lock-detect'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='cldemote'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='cmpccxadd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fbsdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrs'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='intel-psfd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ipred-ctrl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='lam'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='mcdt-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdir64b'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdiri'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pbrsb-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='psdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rrsba-ctrl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='serialize'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ss'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Skylake-Client'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Skylake-Client-IBRS'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Skylake-Client-v1'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Skylake-Client-v2'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Skylake-Client-v3'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Skylake-Client-v4'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Skylake-Server'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Skylake-Server-IBRS'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Skylake-Server-v1'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Skylake-Server-v2'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Skylake-Server-v3'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Skylake-Server-v4'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Skylake-Server-v5'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Snowridge'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='cldemote'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='core-capability'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdir64b'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdiri'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='mpx'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='split-lock-detect'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Snowridge-v1'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='cldemote'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='core-capability'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdir64b'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdiri'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='mpx'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='split-lock-detect'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Snowridge-v2'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='cldemote'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='core-capability'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdir64b'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdiri'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='split-lock-detect'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Snowridge-v3'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='cldemote'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='core-capability'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdir64b'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdiri'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='split-lock-detect'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Snowridge-v4'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='cldemote'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdir64b'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdiri'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='athlon'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='3dnow'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='3dnowext'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='athlon-v1'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='3dnow'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='3dnowext'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='core2duo'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ss'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='core2duo-v1'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ss'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='coreduo'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ss'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='coreduo-v1'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ss'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='n270'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ss'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='n270-v1'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ss'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='phenom'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='3dnow'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='3dnowext'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='phenom-v1'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='3dnow'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='3dnowext'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     </mode>
Jan 21 18:04:55 compute-0 nova_compute[182296]:   </cpu>
Jan 21 18:04:55 compute-0 nova_compute[182296]:   <memoryBacking supported='yes'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <enum name='sourceType'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <value>file</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <value>anonymous</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <value>memfd</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     </enum>
Jan 21 18:04:55 compute-0 nova_compute[182296]:   </memoryBacking>
Jan 21 18:04:55 compute-0 nova_compute[182296]:   <devices>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <disk supported='yes'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='diskDevice'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>disk</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>cdrom</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>floppy</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>lun</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='bus'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>ide</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>fdc</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>scsi</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>virtio</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>usb</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>sata</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='model'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>virtio</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>virtio-transitional</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>virtio-non-transitional</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     </disk>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <graphics supported='yes'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='type'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>vnc</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>egl-headless</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>dbus</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     </graphics>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <video supported='yes'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='modelType'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>vga</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>cirrus</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>virtio</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>none</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>bochs</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>ramfb</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     </video>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <hostdev supported='yes'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='mode'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>subsystem</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='startupPolicy'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>default</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>mandatory</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>requisite</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>optional</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='subsysType'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>usb</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>pci</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>scsi</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='capsType'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='pciBackend'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     </hostdev>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <rng supported='yes'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='model'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>virtio</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>virtio-transitional</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>virtio-non-transitional</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='backendModel'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>random</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>egd</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>builtin</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     </rng>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <filesystem supported='yes'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='driverType'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>path</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>handle</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>virtiofs</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     </filesystem>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <tpm supported='yes'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='model'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>tpm-tis</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>tpm-crb</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='backendModel'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>emulator</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>external</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='backendVersion'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>2.0</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     </tpm>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <redirdev supported='yes'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='bus'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>usb</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     </redirdev>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <channel supported='yes'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='type'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>pty</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>unix</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     </channel>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <crypto supported='yes'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='model'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='type'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>qemu</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='backendModel'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>builtin</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     </crypto>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <interface supported='yes'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='backendType'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>default</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>passt</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     </interface>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <panic supported='yes'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='model'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>isa</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>hyperv</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     </panic>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <console supported='yes'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='type'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>null</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>vc</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>pty</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>dev</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>file</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>pipe</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>stdio</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>udp</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>tcp</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>unix</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>qemu-vdagent</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>dbus</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     </console>
Jan 21 18:04:55 compute-0 nova_compute[182296]:   </devices>
Jan 21 18:04:55 compute-0 nova_compute[182296]:   <features>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <gic supported='no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <vmcoreinfo supported='yes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <genid supported='yes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <backingStoreInput supported='yes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <backup supported='yes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <async-teardown supported='yes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <s390-pv supported='no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <ps2 supported='yes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <tdx supported='no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <sev supported='no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <sgx supported='no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <hyperv supported='yes'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='features'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>relaxed</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>vapic</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>spinlocks</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>vpindex</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>runtime</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>synic</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>stimer</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>reset</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>vendor_id</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>frequencies</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>reenlightenment</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>tlbflush</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>ipi</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>avic</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>emsr_bitmap</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>xmm_input</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <defaults>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <spinlocks>4095</spinlocks>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <stimer_direct>on</stimer_direct>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <tlbflush_direct>on</tlbflush_direct>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <tlbflush_extended>on</tlbflush_extended>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </defaults>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     </hyperv>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <launchSecurity supported='no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:   </features>
Jan 21 18:04:55 compute-0 nova_compute[182296]: </domainCapabilities>
Jan 21 18:04:55 compute-0 nova_compute[182296]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 21 18:04:55 compute-0 nova_compute[182296]: 2026-01-21 18:04:55.086 182300 DEBUG nova.virt.libvirt.host [None req-29ba64e2-34eb-42ca-86e4-9856bcb4b348 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Jan 21 18:04:55 compute-0 nova_compute[182296]: <domainCapabilities>
Jan 21 18:04:55 compute-0 nova_compute[182296]:   <path>/usr/libexec/qemu-kvm</path>
Jan 21 18:04:55 compute-0 nova_compute[182296]:   <domain>kvm</domain>
Jan 21 18:04:55 compute-0 nova_compute[182296]:   <machine>pc-q35-rhel9.8.0</machine>
Jan 21 18:04:55 compute-0 nova_compute[182296]:   <arch>x86_64</arch>
Jan 21 18:04:55 compute-0 nova_compute[182296]:   <vcpu max='4096'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:   <iothreads supported='yes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:   <os supported='yes'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <enum name='firmware'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <value>efi</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     </enum>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <loader supported='yes'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='type'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>rom</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>pflash</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='readonly'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>yes</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>no</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='secure'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>yes</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>no</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     </loader>
Jan 21 18:04:55 compute-0 nova_compute[182296]:   </os>
Jan 21 18:04:55 compute-0 nova_compute[182296]:   <cpu>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <mode name='host-passthrough' supported='yes'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='hostPassthroughMigratable'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>on</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>off</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     </mode>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <mode name='maximum' supported='yes'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='maximumMigratable'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>on</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>off</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     </mode>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <mode name='host-model' supported='yes'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <vendor>AMD</vendor>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <feature policy='require' name='x2apic'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <feature policy='require' name='tsc-deadline'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <feature policy='require' name='hypervisor'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <feature policy='require' name='tsc_adjust'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <feature policy='require' name='spec-ctrl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <feature policy='require' name='stibp'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <feature policy='require' name='ssbd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <feature policy='require' name='cmp_legacy'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <feature policy='require' name='overflow-recov'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <feature policy='require' name='succor'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <feature policy='require' name='ibrs'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <feature policy='require' name='amd-ssbd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <feature policy='require' name='virt-ssbd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <feature policy='require' name='lbrv'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <feature policy='require' name='tsc-scale'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <feature policy='require' name='vmcb-clean'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <feature policy='require' name='flushbyasid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <feature policy='require' name='pause-filter'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <feature policy='require' name='pfthreshold'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <feature policy='require' name='svme-addr-chk'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <feature policy='disable' name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     </mode>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <mode name='custom' supported='yes'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Broadwell'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Broadwell-IBRS'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Broadwell-noTSX'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Broadwell-v1'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Broadwell-v2'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Broadwell-v3'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Broadwell-v4'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Cascadelake-Server'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Cascadelake-Server-v1'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Cascadelake-Server-v2'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Cascadelake-Server-v3'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Cascadelake-Server-v4'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Cascadelake-Server-v5'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='ClearwaterForest'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-ifma'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-ne-convert'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-vnni-int16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-vnni-int8'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='bhi-ctrl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='bhi-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='bus-lock-detect'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='cldemote'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='cmpccxadd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ddpd-u'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fbsdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrs'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='intel-psfd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ipred-ctrl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='lam'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='mcdt-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdir64b'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdiri'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pbrsb-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='prefetchiti'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='psdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rrsba-ctrl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='serialize'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='sha512'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='sm3'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='sm4'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ss'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='ClearwaterForest-v1'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-ifma'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-ne-convert'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-vnni-int16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-vnni-int8'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='bhi-ctrl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='bhi-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='bus-lock-detect'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='cldemote'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='cmpccxadd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ddpd-u'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fbsdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrs'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='intel-psfd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ipred-ctrl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='lam'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='mcdt-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdir64b'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdiri'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pbrsb-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='prefetchiti'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='psdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rrsba-ctrl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='serialize'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='sha512'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='sm3'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='sm4'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ss'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Cooperlake'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-bf16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='taa-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Cooperlake-v1'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-bf16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='taa-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Cooperlake-v2'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-bf16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='taa-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Denverton'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='mpx'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Denverton-v1'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='mpx'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Denverton-v2'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Denverton-v3'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Dhyana-v2'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='EPYC-Genoa'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amd-psfd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='auto-ibrs'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-bf16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512ifma'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='no-nested-data-bp'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='null-sel-clr-base'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='stibp-always-on'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='EPYC-Genoa-v1'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amd-psfd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='auto-ibrs'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-bf16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512ifma'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='no-nested-data-bp'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='null-sel-clr-base'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='stibp-always-on'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='EPYC-Genoa-v2'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amd-psfd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='auto-ibrs'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-bf16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512ifma'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fs-gs-base-ns'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='no-nested-data-bp'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='null-sel-clr-base'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='perfmon-v2'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='stibp-always-on'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='EPYC-Milan'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='EPYC-Milan-v1'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='EPYC-Milan-v2'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amd-psfd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='no-nested-data-bp'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='null-sel-clr-base'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='stibp-always-on'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='EPYC-Milan-v3'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amd-psfd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='no-nested-data-bp'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='null-sel-clr-base'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='stibp-always-on'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='EPYC-Rome'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='EPYC-Rome-v1'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='EPYC-Rome-v2'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='EPYC-Rome-v3'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='EPYC-Turin'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amd-psfd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='auto-ibrs'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-bf16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-vp2intersect'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512ifma'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fs-gs-base-ns'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibpb-brtype'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdir64b'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdiri'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='no-nested-data-bp'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='null-sel-clr-base'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='perfmon-v2'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='prefetchi'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='sbpb'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='srso-user-kernel-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='stibp-always-on'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='EPYC-Turin-v1'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amd-psfd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='auto-ibrs'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-bf16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-vp2intersect'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512ifma'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fs-gs-base-ns'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibpb-brtype'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdir64b'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdiri'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='no-nested-data-bp'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='null-sel-clr-base'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='perfmon-v2'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='prefetchi'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='sbpb'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='srso-user-kernel-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='stibp-always-on'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='EPYC-v3'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='EPYC-v4'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='EPYC-v5'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='GraniteRapids'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-bf16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-fp16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-int8'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-tile'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-bf16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-fp16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512ifma'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='bus-lock-detect'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fbsdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrc'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrs'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fzrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='mcdt-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pbrsb-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='prefetchiti'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='psdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='serialize'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='taa-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='tsx-ldtrk'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xfd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='GraniteRapids-v1'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-bf16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-fp16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-int8'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-tile'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-bf16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-fp16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512ifma'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='bus-lock-detect'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fbsdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrc'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrs'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fzrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='mcdt-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pbrsb-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='prefetchiti'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='psdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='serialize'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='taa-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='tsx-ldtrk'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xfd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='GraniteRapids-v2'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-bf16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-fp16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-int8'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-tile'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx10'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx10-128'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx10-256'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx10-512'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-bf16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-fp16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512ifma'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='bus-lock-detect'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='cldemote'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fbsdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrc'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrs'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fzrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='mcdt-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdir64b'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdiri'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pbrsb-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='prefetchiti'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='psdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='serialize'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ss'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='taa-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='tsx-ldtrk'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xfd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='GraniteRapids-v3'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-bf16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-fp16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-int8'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-tile'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx10'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx10-128'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx10-256'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx10-512'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-bf16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-fp16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512ifma'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='bus-lock-detect'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='cldemote'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fbsdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrc'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrs'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fzrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='mcdt-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdir64b'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdiri'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pbrsb-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='prefetchiti'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='psdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='serialize'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ss'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='taa-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='tsx-ldtrk'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xfd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Haswell'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Haswell-IBRS'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Haswell-noTSX'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Haswell-v1'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Haswell-v2'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Haswell-v3'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Haswell-v4'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Icelake-Server'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Icelake-Server-noTSX'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Icelake-Server-v1'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Icelake-Server-v2'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Icelake-Server-v3'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='taa-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Icelake-Server-v4'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512ifma'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='taa-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Icelake-Server-v5'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512ifma'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='taa-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Icelake-Server-v6'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512ifma'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='taa-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Icelake-Server-v7'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512ifma'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='taa-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='IvyBridge'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='IvyBridge-IBRS'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='IvyBridge-v1'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='IvyBridge-v2'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='KnightsMill'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-4fmaps'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-4vnniw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512er'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512pf'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ss'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='KnightsMill-v1'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-4fmaps'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-4vnniw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512er'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512pf'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ss'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Opteron_G4'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fma4'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xop'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Opteron_G4-v1'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fma4'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xop'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Opteron_G5'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fma4'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='tbm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xop'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Opteron_G5-v1'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fma4'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='tbm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xop'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='SapphireRapids'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-bf16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-int8'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-tile'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-bf16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-fp16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512ifma'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='bus-lock-detect'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrc'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrs'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fzrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='serialize'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='taa-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='tsx-ldtrk'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xfd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='SapphireRapids-v1'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-bf16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-int8'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-tile'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-bf16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-fp16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512ifma'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='bus-lock-detect'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrc'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrs'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fzrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='serialize'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='taa-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='tsx-ldtrk'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xfd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='SapphireRapids-v2'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-bf16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-int8'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-tile'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-bf16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-fp16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512ifma'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='bus-lock-detect'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fbsdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrc'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrs'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fzrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='psdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='serialize'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='taa-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='tsx-ldtrk'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xfd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='SapphireRapids-v3'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-bf16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-int8'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-tile'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-bf16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-fp16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512ifma'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='bus-lock-detect'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='cldemote'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fbsdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrc'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrs'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fzrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdir64b'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdiri'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='psdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='serialize'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ss'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='taa-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='tsx-ldtrk'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xfd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='SapphireRapids-v4'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-bf16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-int8'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='amx-tile'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-bf16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-fp16'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bitalg'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512ifma'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vbmi2'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='bus-lock-detect'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='cldemote'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fbsdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrc'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrs'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fzrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='la57'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdir64b'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdiri'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='psdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='serialize'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ss'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='taa-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='tsx-ldtrk'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xfd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='SierraForest'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-ifma'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-ne-convert'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-vnni-int8'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='bus-lock-detect'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='cmpccxadd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fbsdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrs'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='mcdt-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pbrsb-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='psdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='serialize'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='SierraForest-v1'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-ifma'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-ne-convert'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-vnni-int8'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='bus-lock-detect'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='cmpccxadd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fbsdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrs'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='mcdt-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pbrsb-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='psdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='serialize'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='SierraForest-v2'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-ifma'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-ne-convert'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-vnni-int8'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='bhi-ctrl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='bus-lock-detect'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='cldemote'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='cmpccxadd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fbsdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrs'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='intel-psfd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ipred-ctrl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='lam'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='mcdt-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdir64b'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdiri'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pbrsb-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='psdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rrsba-ctrl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='serialize'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ss'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='SierraForest-v3'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-ifma'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-ne-convert'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-vnni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx-vnni-int8'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='bhi-ctrl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='bus-lock-detect'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='cldemote'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='cmpccxadd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fbsdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='fsrs'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ibrs-all'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='intel-psfd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ipred-ctrl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='lam'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='mcdt-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdir64b'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdiri'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pbrsb-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='psdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rrsba-ctrl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='serialize'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ss'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vaes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='vpclmulqdq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Skylake-Client'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Skylake-Client-IBRS'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Skylake-Client-v1'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Skylake-Client-v2'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Skylake-Client-v3'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Skylake-Client-v4'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Skylake-Server'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Skylake-Server-IBRS'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Skylake-Server-v1'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Skylake-Server-v2'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='hle'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='rtm'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Skylake-Server-v3'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Skylake-Server-v4'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Skylake-Server-v5'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512bw'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512cd'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512dq'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512f'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='avx512vl'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='invpcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pcid'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='pku'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Snowridge'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='cldemote'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='core-capability'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdir64b'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdiri'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='mpx'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='split-lock-detect'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Snowridge-v1'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='cldemote'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='core-capability'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdir64b'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdiri'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='mpx'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='split-lock-detect'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Snowridge-v2'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='cldemote'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='core-capability'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdir64b'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdiri'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='split-lock-detect'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Snowridge-v3'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='cldemote'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='core-capability'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdir64b'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdiri'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='split-lock-detect'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='Snowridge-v4'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='cldemote'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='erms'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='gfni'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdir64b'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='movdiri'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='xsaves'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='athlon'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='3dnow'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='3dnowext'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='athlon-v1'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='3dnow'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='3dnowext'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='core2duo'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ss'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='core2duo-v1'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ss'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='coreduo'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ss'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='coreduo-v1'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ss'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='n270'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ss'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='n270-v1'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='ss'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='phenom'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='3dnow'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='3dnowext'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <blockers model='phenom-v1'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='3dnow'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <feature name='3dnowext'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </blockers>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     </mode>
Jan 21 18:04:55 compute-0 nova_compute[182296]:   </cpu>
Jan 21 18:04:55 compute-0 nova_compute[182296]:   <memoryBacking supported='yes'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <enum name='sourceType'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <value>file</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <value>anonymous</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <value>memfd</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     </enum>
Jan 21 18:04:55 compute-0 nova_compute[182296]:   </memoryBacking>
Jan 21 18:04:55 compute-0 nova_compute[182296]:   <devices>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <disk supported='yes'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='diskDevice'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>disk</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>cdrom</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>floppy</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>lun</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='bus'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>fdc</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>scsi</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>virtio</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>usb</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>sata</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='model'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>virtio</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>virtio-transitional</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>virtio-non-transitional</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     </disk>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <graphics supported='yes'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='type'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>vnc</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>egl-headless</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>dbus</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     </graphics>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <video supported='yes'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='modelType'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>vga</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>cirrus</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>virtio</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>none</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>bochs</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>ramfb</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     </video>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <hostdev supported='yes'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='mode'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>subsystem</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='startupPolicy'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>default</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>mandatory</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>requisite</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>optional</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='subsysType'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>usb</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>pci</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>scsi</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='capsType'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='pciBackend'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     </hostdev>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <rng supported='yes'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='model'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>virtio</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>virtio-transitional</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>virtio-non-transitional</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='backendModel'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>random</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>egd</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>builtin</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     </rng>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <filesystem supported='yes'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='driverType'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>path</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>handle</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>virtiofs</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     </filesystem>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <tpm supported='yes'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='model'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>tpm-tis</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>tpm-crb</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='backendModel'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>emulator</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>external</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='backendVersion'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>2.0</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     </tpm>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <redirdev supported='yes'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='bus'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>usb</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     </redirdev>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <channel supported='yes'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='type'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>pty</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>unix</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     </channel>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <crypto supported='yes'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='model'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='type'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>qemu</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='backendModel'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>builtin</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     </crypto>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <interface supported='yes'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='backendType'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>default</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>passt</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     </interface>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <panic supported='yes'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='model'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>isa</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>hyperv</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     </panic>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <console supported='yes'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='type'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>null</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>vc</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>pty</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>dev</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>file</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>pipe</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>stdio</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>udp</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>tcp</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>unix</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>qemu-vdagent</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>dbus</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     </console>
Jan 21 18:04:55 compute-0 nova_compute[182296]:   </devices>
Jan 21 18:04:55 compute-0 nova_compute[182296]:   <features>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <gic supported='no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <vmcoreinfo supported='yes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <genid supported='yes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <backingStoreInput supported='yes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <backup supported='yes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <async-teardown supported='yes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <s390-pv supported='no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <ps2 supported='yes'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <tdx supported='no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <sev supported='no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <sgx supported='no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <hyperv supported='yes'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <enum name='features'>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>relaxed</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>vapic</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>spinlocks</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>vpindex</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>runtime</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>synic</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>stimer</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>reset</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>vendor_id</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>frequencies</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>reenlightenment</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>tlbflush</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>ipi</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>avic</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>emsr_bitmap</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <value>xmm_input</value>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </enum>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       <defaults>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <spinlocks>4095</spinlocks>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <stimer_direct>on</stimer_direct>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <tlbflush_direct>on</tlbflush_direct>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <tlbflush_extended>on</tlbflush_extended>
Jan 21 18:04:55 compute-0 nova_compute[182296]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 21 18:04:55 compute-0 nova_compute[182296]:       </defaults>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     </hyperv>
Jan 21 18:04:55 compute-0 nova_compute[182296]:     <launchSecurity supported='no'/>
Jan 21 18:04:55 compute-0 nova_compute[182296]:   </features>
Jan 21 18:04:55 compute-0 nova_compute[182296]: </domainCapabilities>
Jan 21 18:04:55 compute-0 nova_compute[182296]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 21 18:04:55 compute-0 nova_compute[182296]: 2026-01-21 18:04:55.172 182300 DEBUG nova.virt.libvirt.host [None req-29ba64e2-34eb-42ca-86e4-9856bcb4b348 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Jan 21 18:04:55 compute-0 nova_compute[182296]: 2026-01-21 18:04:55.173 182300 DEBUG nova.virt.libvirt.host [None req-29ba64e2-34eb-42ca-86e4-9856bcb4b348 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Jan 21 18:04:55 compute-0 nova_compute[182296]: 2026-01-21 18:04:55.173 182300 DEBUG nova.virt.libvirt.host [None req-29ba64e2-34eb-42ca-86e4-9856bcb4b348 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Jan 21 18:04:55 compute-0 nova_compute[182296]: 2026-01-21 18:04:55.182 182300 INFO nova.virt.libvirt.host [None req-29ba64e2-34eb-42ca-86e4-9856bcb4b348 - - - - - -] Secure Boot support detected
Jan 21 18:04:55 compute-0 nova_compute[182296]: 2026-01-21 18:04:55.184 182300 INFO nova.virt.libvirt.driver [None req-29ba64e2-34eb-42ca-86e4-9856bcb4b348 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Jan 21 18:04:55 compute-0 nova_compute[182296]: 2026-01-21 18:04:55.184 182300 INFO nova.virt.libvirt.driver [None req-29ba64e2-34eb-42ca-86e4-9856bcb4b348 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Jan 21 18:04:55 compute-0 nova_compute[182296]: 2026-01-21 18:04:55.196 182300 DEBUG nova.virt.libvirt.driver [None req-29ba64e2-34eb-42ca-86e4-9856bcb4b348 - - - - - -] cpu compare xml: <cpu match="exact">
Jan 21 18:04:55 compute-0 nova_compute[182296]:   <model>Nehalem</model>
Jan 21 18:04:55 compute-0 nova_compute[182296]: </cpu>
Jan 21 18:04:55 compute-0 nova_compute[182296]:  _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019
Jan 21 18:04:55 compute-0 nova_compute[182296]: 2026-01-21 18:04:55.199 182300 DEBUG nova.virt.libvirt.driver [None req-29ba64e2-34eb-42ca-86e4-9856bcb4b348 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Jan 21 18:04:55 compute-0 sudo[183000]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oopjopscfpfymkbwluiphovdicqyjtgh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018694.7700725-2408-95816849900474/AnsiballZ_podman_container.py'
Jan 21 18:04:55 compute-0 sudo[183000]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:04:55 compute-0 podman[182951]: 2026-01-21 18:04:55.381213986 +0000 UTC m=+0.089256478 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 21 18:04:55 compute-0 podman[182950]: 2026-01-21 18:04:55.420797694 +0000 UTC m=+0.130625099 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller)
Jan 21 18:04:55 compute-0 python3.9[183015]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Jan 21 18:04:55 compute-0 sudo[183000]: pam_unix(sudo:session): session closed for user root
Jan 21 18:04:55 compute-0 nova_compute[182296]: 2026-01-21 18:04:55.920 182300 INFO nova.virt.node [None req-29ba64e2-34eb-42ca-86e4-9856bcb4b348 - - - - - -] Determined node identity 502e4243-611b-433d-a766-9b485d51652d from /var/lib/nova/compute_id
Jan 21 18:04:55 compute-0 nova_compute[182296]: 2026-01-21 18:04:55.942 182300 WARNING nova.compute.manager [None req-29ba64e2-34eb-42ca-86e4-9856bcb4b348 - - - - - -] Compute nodes ['502e4243-611b-433d-a766-9b485d51652d'] for host compute-0.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
Jan 21 18:04:55 compute-0 nova_compute[182296]: 2026-01-21 18:04:55.978 182300 INFO nova.compute.manager [None req-29ba64e2-34eb-42ca-86e4-9856bcb4b348 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Jan 21 18:04:56 compute-0 rsyslogd[1002]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 21 18:04:56 compute-0 nova_compute[182296]: 2026-01-21 18:04:56.074 182300 WARNING nova.compute.manager [None req-29ba64e2-34eb-42ca-86e4-9856bcb4b348 - - - - - -] No compute node record found for host compute-0.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Jan 21 18:04:56 compute-0 nova_compute[182296]: 2026-01-21 18:04:56.074 182300 DEBUG oslo_concurrency.lockutils [None req-29ba64e2-34eb-42ca-86e4-9856bcb4b348 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:04:56 compute-0 nova_compute[182296]: 2026-01-21 18:04:56.075 182300 DEBUG oslo_concurrency.lockutils [None req-29ba64e2-34eb-42ca-86e4-9856bcb4b348 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:04:56 compute-0 nova_compute[182296]: 2026-01-21 18:04:56.075 182300 DEBUG oslo_concurrency.lockutils [None req-29ba64e2-34eb-42ca-86e4-9856bcb4b348 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:04:56 compute-0 nova_compute[182296]: 2026-01-21 18:04:56.075 182300 DEBUG nova.compute.resource_tracker [None req-29ba64e2-34eb-42ca-86e4-9856bcb4b348 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 18:04:56 compute-0 systemd[1]: Starting libvirt nodedev daemon...
Jan 21 18:04:56 compute-0 systemd[1]: Started libvirt nodedev daemon.
Jan 21 18:04:56 compute-0 sudo[183218]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wraizfvogrgksvocpdyekbitvdojorys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018695.9796376-2424-266086653319284/AnsiballZ_systemd.py'
Jan 21 18:04:56 compute-0 sudo[183218]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:04:56 compute-0 nova_compute[182296]: 2026-01-21 18:04:56.409 182300 WARNING nova.virt.libvirt.driver [None req-29ba64e2-34eb-42ca-86e4-9856bcb4b348 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 18:04:56 compute-0 nova_compute[182296]: 2026-01-21 18:04:56.410 182300 DEBUG nova.compute.resource_tracker [None req-29ba64e2-34eb-42ca-86e4-9856bcb4b348 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6186MB free_disk=73.58443450927734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 18:04:56 compute-0 nova_compute[182296]: 2026-01-21 18:04:56.411 182300 DEBUG oslo_concurrency.lockutils [None req-29ba64e2-34eb-42ca-86e4-9856bcb4b348 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:04:56 compute-0 nova_compute[182296]: 2026-01-21 18:04:56.411 182300 DEBUG oslo_concurrency.lockutils [None req-29ba64e2-34eb-42ca-86e4-9856bcb4b348 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:04:56 compute-0 nova_compute[182296]: 2026-01-21 18:04:56.427 182300 WARNING nova.compute.resource_tracker [None req-29ba64e2-34eb-42ca-86e4-9856bcb4b348 - - - - - -] No compute node record for compute-0.ctlplane.example.com:502e4243-611b-433d-a766-9b485d51652d: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 502e4243-611b-433d-a766-9b485d51652d could not be found.
Jan 21 18:04:56 compute-0 nova_compute[182296]: 2026-01-21 18:04:56.448 182300 INFO nova.compute.resource_tracker [None req-29ba64e2-34eb-42ca-86e4-9856bcb4b348 - - - - - -] Compute node record created for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com with uuid: 502e4243-611b-433d-a766-9b485d51652d
Jan 21 18:04:56 compute-0 nova_compute[182296]: 2026-01-21 18:04:56.520 182300 DEBUG nova.compute.resource_tracker [None req-29ba64e2-34eb-42ca-86e4-9856bcb4b348 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 18:04:56 compute-0 nova_compute[182296]: 2026-01-21 18:04:56.520 182300 DEBUG nova.compute.resource_tracker [None req-29ba64e2-34eb-42ca-86e4-9856bcb4b348 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 18:04:56 compute-0 python3.9[183220]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 21 18:04:56 compute-0 systemd[1]: Stopping nova_compute container...
Jan 21 18:04:56 compute-0 nova_compute[182296]: 2026-01-21 18:04:56.745 182300 DEBUG oslo_concurrency.lockutils [None req-29ba64e2-34eb-42ca-86e4-9856bcb4b348 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.334s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:04:56 compute-0 nova_compute[182296]: 2026-01-21 18:04:56.746 182300 DEBUG oslo_concurrency.lockutils [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:04:56 compute-0 nova_compute[182296]: 2026-01-21 18:04:56.746 182300 DEBUG oslo_concurrency.lockutils [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 18:04:56 compute-0 nova_compute[182296]: 2026-01-21 18:04:56.746 182300 DEBUG oslo_concurrency.lockutils [None req-f6c12df1-24d6-4817-9ff1-deb470b3c73b - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 18:04:57 compute-0 virtqemud[182681]: libvirt version: 11.10.0, package: 2.el9 (builder@centos.org, 2025-12-18-15:09:54, )
Jan 21 18:04:57 compute-0 virtqemud[182681]: hostname: compute-0
Jan 21 18:04:57 compute-0 virtqemud[182681]: End of file while reading data: Input/output error
Jan 21 18:04:57 compute-0 systemd[1]: libpod-b597cf4015a6e9d1d3d88ee855ec80c3443a363fe8697a0cf71d96d17aa2226f.scope: Deactivated successfully.
Jan 21 18:04:57 compute-0 systemd[1]: libpod-b597cf4015a6e9d1d3d88ee855ec80c3443a363fe8697a0cf71d96d17aa2226f.scope: Consumed 3.260s CPU time.
Jan 21 18:04:57 compute-0 podman[183224]: 2026-01-21 18:04:57.255438276 +0000 UTC m=+0.557655228 container died b597cf4015a6e9d1d3d88ee855ec80c3443a363fe8697a0cf71d96d17aa2226f (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, container_name=nova_compute, managed_by=edpm_ansible)
Jan 21 18:04:57 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b597cf4015a6e9d1d3d88ee855ec80c3443a363fe8697a0cf71d96d17aa2226f-userdata-shm.mount: Deactivated successfully.
Jan 21 18:04:57 compute-0 systemd[1]: var-lib-containers-storage-overlay-8a7c1c5294b4767ed1c439179e42fc3f67ebe451304dea8b0b429adc65bd1db9-merged.mount: Deactivated successfully.
Jan 21 18:04:57 compute-0 podman[183224]: 2026-01-21 18:04:57.330243477 +0000 UTC m=+0.632460409 container cleanup b597cf4015a6e9d1d3d88ee855ec80c3443a363fe8697a0cf71d96d17aa2226f (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 18:04:57 compute-0 podman[183224]: nova_compute
Jan 21 18:04:57 compute-0 podman[183251]: nova_compute
Jan 21 18:04:57 compute-0 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Jan 21 18:04:57 compute-0 systemd[1]: Stopped nova_compute container.
Jan 21 18:04:57 compute-0 systemd[1]: Starting nova_compute container...
Jan 21 18:04:57 compute-0 systemd[1]: Started libcrun container.
Jan 21 18:04:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8a7c1c5294b4767ed1c439179e42fc3f67ebe451304dea8b0b429adc65bd1db9/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Jan 21 18:04:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8a7c1c5294b4767ed1c439179e42fc3f67ebe451304dea8b0b429adc65bd1db9/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Jan 21 18:04:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8a7c1c5294b4767ed1c439179e42fc3f67ebe451304dea8b0b429adc65bd1db9/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Jan 21 18:04:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8a7c1c5294b4767ed1c439179e42fc3f67ebe451304dea8b0b429adc65bd1db9/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 21 18:04:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8a7c1c5294b4767ed1c439179e42fc3f67ebe451304dea8b0b429adc65bd1db9/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Jan 21 18:04:57 compute-0 podman[183265]: 2026-01-21 18:04:57.510038563 +0000 UTC m=+0.095277143 container init b597cf4015a6e9d1d3d88ee855ec80c3443a363fe8697a0cf71d96d17aa2226f (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm)
Jan 21 18:04:57 compute-0 podman[183265]: 2026-01-21 18:04:57.517105032 +0000 UTC m=+0.102343602 container start b597cf4015a6e9d1d3d88ee855ec80c3443a363fe8697a0cf71d96d17aa2226f (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 21 18:04:57 compute-0 podman[183265]: nova_compute
Jan 21 18:04:57 compute-0 nova_compute[183278]: + sudo -E kolla_set_configs
Jan 21 18:04:57 compute-0 systemd[1]: Started nova_compute container.
Jan 21 18:04:57 compute-0 sudo[183218]: pam_unix(sudo:session): session closed for user root
Jan 21 18:04:57 compute-0 nova_compute[183278]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 21 18:04:57 compute-0 nova_compute[183278]: INFO:__main__:Validating config file
Jan 21 18:04:57 compute-0 nova_compute[183278]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 21 18:04:57 compute-0 nova_compute[183278]: INFO:__main__:Copying service configuration files
Jan 21 18:04:57 compute-0 nova_compute[183278]: INFO:__main__:Deleting /etc/nova/nova.conf
Jan 21 18:04:57 compute-0 nova_compute[183278]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Jan 21 18:04:57 compute-0 nova_compute[183278]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Jan 21 18:04:57 compute-0 nova_compute[183278]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Jan 21 18:04:57 compute-0 nova_compute[183278]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Jan 21 18:04:57 compute-0 nova_compute[183278]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Jan 21 18:04:57 compute-0 nova_compute[183278]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 21 18:04:57 compute-0 nova_compute[183278]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 21 18:04:57 compute-0 nova_compute[183278]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 21 18:04:57 compute-0 nova_compute[183278]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Jan 21 18:04:57 compute-0 nova_compute[183278]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Jan 21 18:04:57 compute-0 nova_compute[183278]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Jan 21 18:04:57 compute-0 nova_compute[183278]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 21 18:04:57 compute-0 nova_compute[183278]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 21 18:04:57 compute-0 nova_compute[183278]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 21 18:04:57 compute-0 nova_compute[183278]: INFO:__main__:Deleting /etc/ceph
Jan 21 18:04:57 compute-0 nova_compute[183278]: INFO:__main__:Creating directory /etc/ceph
Jan 21 18:04:57 compute-0 nova_compute[183278]: INFO:__main__:Setting permission for /etc/ceph
Jan 21 18:04:57 compute-0 nova_compute[183278]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Jan 21 18:04:57 compute-0 nova_compute[183278]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Jan 21 18:04:57 compute-0 nova_compute[183278]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 21 18:04:57 compute-0 nova_compute[183278]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Jan 21 18:04:57 compute-0 nova_compute[183278]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Jan 21 18:04:57 compute-0 nova_compute[183278]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 21 18:04:57 compute-0 nova_compute[183278]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Jan 21 18:04:57 compute-0 nova_compute[183278]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Jan 21 18:04:57 compute-0 nova_compute[183278]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Jan 21 18:04:57 compute-0 nova_compute[183278]: INFO:__main__:Writing out command to execute
Jan 21 18:04:57 compute-0 nova_compute[183278]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Jan 21 18:04:57 compute-0 nova_compute[183278]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 21 18:04:57 compute-0 nova_compute[183278]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 21 18:04:57 compute-0 nova_compute[183278]: ++ cat /run_command
Jan 21 18:04:57 compute-0 nova_compute[183278]: + CMD=nova-compute
Jan 21 18:04:57 compute-0 nova_compute[183278]: + ARGS=
Jan 21 18:04:57 compute-0 nova_compute[183278]: + sudo kolla_copy_cacerts
Jan 21 18:04:57 compute-0 nova_compute[183278]: + [[ ! -n '' ]]
Jan 21 18:04:57 compute-0 nova_compute[183278]: + . kolla_extend_start
Jan 21 18:04:57 compute-0 nova_compute[183278]: Running command: 'nova-compute'
Jan 21 18:04:57 compute-0 nova_compute[183278]: + echo 'Running command: '\''nova-compute'\'''
Jan 21 18:04:57 compute-0 nova_compute[183278]: + umask 0022
Jan 21 18:04:57 compute-0 nova_compute[183278]: + exec nova-compute
Jan 21 18:04:58 compute-0 sudo[183442]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-strbanzgodznkmvamkjfdoepwhnkjznm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018698.23857-2442-66686267414790/AnsiballZ_podman_container.py'
Jan 21 18:04:58 compute-0 sudo[183442]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:04:58 compute-0 python3.9[183444]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Jan 21 18:04:58 compute-0 systemd[1]: Started libpod-conmon-42f35023b5a9f99dad8a6c8f4bf94f3a3aacdf107e6937a63c6fd131f067ecdd.scope.
Jan 21 18:04:59 compute-0 systemd[1]: Started libcrun container.
Jan 21 18:04:59 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4355d499f821e20e818d9b2f8f3aa3c64d8cc305722370b4fc115f649924a46a/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Jan 21 18:04:59 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4355d499f821e20e818d9b2f8f3aa3c64d8cc305722370b4fc115f649924a46a/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 21 18:04:59 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4355d499f821e20e818d9b2f8f3aa3c64d8cc305722370b4fc115f649924a46a/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Jan 21 18:04:59 compute-0 podman[183468]: 2026-01-21 18:04:59.038787667 +0000 UTC m=+0.113851537 container init 42f35023b5a9f99dad8a6c8f4bf94f3a3aacdf107e6937a63c6fd131f067ecdd (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 21 18:04:59 compute-0 podman[183468]: 2026-01-21 18:04:59.044604376 +0000 UTC m=+0.119668216 container start 42f35023b5a9f99dad8a6c8f4bf94f3a3aacdf107e6937a63c6fd131f067ecdd (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, container_name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=edpm, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 18:04:59 compute-0 python3.9[183444]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Jan 21 18:04:59 compute-0 nova_compute_init[183490]: INFO:nova_statedir:Applying nova statedir ownership
Jan 21 18:04:59 compute-0 nova_compute_init[183490]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Jan 21 18:04:59 compute-0 nova_compute_init[183490]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Jan 21 18:04:59 compute-0 nova_compute_init[183490]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Jan 21 18:04:59 compute-0 nova_compute_init[183490]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Jan 21 18:04:59 compute-0 nova_compute_init[183490]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Jan 21 18:04:59 compute-0 nova_compute_init[183490]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Jan 21 18:04:59 compute-0 nova_compute_init[183490]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Jan 21 18:04:59 compute-0 nova_compute_init[183490]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Jan 21 18:04:59 compute-0 nova_compute_init[183490]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Jan 21 18:04:59 compute-0 nova_compute_init[183490]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Jan 21 18:04:59 compute-0 nova_compute_init[183490]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Jan 21 18:04:59 compute-0 nova_compute_init[183490]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Jan 21 18:04:59 compute-0 nova_compute_init[183490]: INFO:nova_statedir:Nova statedir ownership complete
Jan 21 18:04:59 compute-0 systemd[1]: libpod-42f35023b5a9f99dad8a6c8f4bf94f3a3aacdf107e6937a63c6fd131f067ecdd.scope: Deactivated successfully.
Jan 21 18:04:59 compute-0 podman[183503]: 2026-01-21 18:04:59.133164698 +0000 UTC m=+0.025709047 container died 42f35023b5a9f99dad8a6c8f4bf94f3a3aacdf107e6937a63c6fd131f067ecdd (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.license=GPLv2, tcib_managed=true, container_name=nova_compute_init, io.buildah.version=1.41.3)
Jan 21 18:04:59 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-42f35023b5a9f99dad8a6c8f4bf94f3a3aacdf107e6937a63c6fd131f067ecdd-userdata-shm.mount: Deactivated successfully.
Jan 21 18:04:59 compute-0 systemd[1]: var-lib-containers-storage-overlay-4355d499f821e20e818d9b2f8f3aa3c64d8cc305722370b4fc115f649924a46a-merged.mount: Deactivated successfully.
Jan 21 18:04:59 compute-0 podman[183503]: 2026-01-21 18:04:59.170549583 +0000 UTC m=+0.063093912 container cleanup 42f35023b5a9f99dad8a6c8f4bf94f3a3aacdf107e6937a63c6fd131f067ecdd (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=nova_compute_init, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 21 18:04:59 compute-0 sudo[183442]: pam_unix(sudo:session): session closed for user root
Jan 21 18:04:59 compute-0 systemd[1]: libpod-conmon-42f35023b5a9f99dad8a6c8f4bf94f3a3aacdf107e6937a63c6fd131f067ecdd.scope: Deactivated successfully.
Jan 21 18:04:59 compute-0 nova_compute[183278]: 2026-01-21 18:04:59.619 183284 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Jan 21 18:04:59 compute-0 nova_compute[183278]: 2026-01-21 18:04:59.619 183284 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Jan 21 18:04:59 compute-0 nova_compute[183278]: 2026-01-21 18:04:59.620 183284 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Jan 21 18:04:59 compute-0 nova_compute[183278]: 2026-01-21 18:04:59.620 183284 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Jan 21 18:04:59 compute-0 sshd-session[160186]: Connection closed by 192.168.122.30 port 53706
Jan 21 18:04:59 compute-0 nova_compute[183278]: 2026-01-21 18:04:59.776 183284 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:04:59 compute-0 sshd-session[160183]: pam_unix(sshd:session): session closed for user zuul
Jan 21 18:04:59 compute-0 nova_compute[183278]: 2026-01-21 18:04:59.806 183284 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.030s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:04:59 compute-0 nova_compute[183278]: 2026-01-21 18:04:59.806 183284 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Jan 21 18:04:59 compute-0 systemd[1]: session-25.scope: Deactivated successfully.
Jan 21 18:04:59 compute-0 systemd[1]: session-25.scope: Consumed 1min 31.829s CPU time.
Jan 21 18:04:59 compute-0 systemd-logind[782]: Session 25 logged out. Waiting for processes to exit.
Jan 21 18:04:59 compute-0 systemd-logind[782]: Removed session 25.
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.350 183284 INFO nova.virt.driver [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.474 183284 INFO nova.compute.provider_config [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.819 183284 DEBUG oslo_concurrency.lockutils [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.819 183284 DEBUG oslo_concurrency.lockutils [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.820 183284 DEBUG oslo_concurrency.lockutils [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.820 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.821 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.821 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.821 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.821 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.821 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.822 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.822 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.822 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.822 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.822 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.823 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.823 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.823 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.823 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.824 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.824 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.824 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.824 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.824 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.824 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.825 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.825 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.825 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.825 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.825 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.826 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.826 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.826 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.826 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.827 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.827 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.827 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.827 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.827 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.828 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.828 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.828 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.828 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.828 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.829 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.829 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.829 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.829 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.830 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.830 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.830 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.830 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.830 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.831 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.831 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.831 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.831 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.831 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.832 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.832 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.832 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.832 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.832 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.832 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.833 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.833 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.833 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.833 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.833 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.834 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.834 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.834 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.834 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.834 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.835 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.835 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.835 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.835 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.836 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.836 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.836 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.836 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.836 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.837 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.837 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.837 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.837 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.837 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.838 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.838 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.838 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.838 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.838 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.839 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.839 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.839 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.839 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.840 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.840 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.840 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.840 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.841 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.841 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.841 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.842 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.842 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.842 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.842 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.842 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.843 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.843 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.843 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.843 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.844 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.844 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.844 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.844 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.845 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.845 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.845 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.845 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.846 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.846 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.846 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.846 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.846 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.847 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.847 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.847 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.847 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.847 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.848 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.848 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.848 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.848 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.848 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.849 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.849 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.849 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.849 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.849 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.850 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.850 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.850 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.850 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.850 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.851 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.851 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.851 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.851 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.851 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.852 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.852 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.852 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.852 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.853 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.853 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.853 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.853 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.854 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.854 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.854 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.854 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.854 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.855 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.855 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.855 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.855 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.855 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.856 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.856 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.856 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.856 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.856 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.857 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.857 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.857 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.857 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.858 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.858 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.858 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.858 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.858 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.859 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.859 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.859 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.859 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.859 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.860 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.860 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.860 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.860 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.861 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.861 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.861 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.862 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.862 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.862 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.862 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.863 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.863 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.863 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.863 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.864 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.864 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.864 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.864 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.865 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.865 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.865 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.866 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.866 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.866 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.866 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.867 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.867 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.867 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.868 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.868 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.868 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.869 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.869 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.869 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.870 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.870 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.870 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.870 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.871 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.871 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.871 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.872 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.872 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.872 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.873 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.873 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.873 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.873 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.874 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.874 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.874 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.874 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.875 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.875 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.875 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.876 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.876 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.876 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.876 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.877 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.877 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.877 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.878 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.878 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.878 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.879 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.879 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.879 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.879 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.880 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.880 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.880 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.881 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.881 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.881 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.882 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.882 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.882 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.882 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.883 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.883 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.883 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.884 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.884 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.884 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.885 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.885 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.885 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.886 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.886 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.886 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.886 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.887 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.887 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.887 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.888 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.888 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.888 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.889 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.889 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.889 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.889 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.890 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.890 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.890 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.890 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.891 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.891 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.891 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.891 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.891 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.892 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.892 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.892 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.892 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.892 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.893 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.893 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.893 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.893 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.893 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.894 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.894 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.894 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.894 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.894 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.895 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.895 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.895 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.895 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.895 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.895 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.896 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.896 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.896 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.896 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.896 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.896 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.897 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.897 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.897 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.897 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.897 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.898 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.898 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.898 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.898 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.898 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.899 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.899 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.899 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.899 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.899 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.900 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.900 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.900 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.900 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.900 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.900 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.901 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.901 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.901 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.901 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.902 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.902 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.902 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.902 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.902 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.903 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.903 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.903 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.903 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.904 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.904 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.904 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.904 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.904 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.905 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.905 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.905 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.905 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.905 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.906 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.906 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.906 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.906 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.906 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.906 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.907 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.907 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.907 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.907 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.907 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.908 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.908 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.908 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.908 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.908 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.908 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.909 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.909 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.909 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.909 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.909 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.909 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.910 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.910 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.910 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.910 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.910 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.910 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.910 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.911 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.911 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.911 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.911 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.911 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.912 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.912 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.912 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.912 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.912 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.912 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.913 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.913 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.913 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.913 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.913 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.913 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.913 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.914 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.914 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.914 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.914 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.914 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.914 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.914 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.915 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.915 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.915 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.915 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.915 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.915 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.915 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.916 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.916 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.916 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.916 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.916 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.916 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.916 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.917 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.917 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.917 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.917 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.917 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.917 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.917 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.918 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.918 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.918 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.918 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.918 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.918 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.918 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.918 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.919 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.919 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.919 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.919 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.919 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.919 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.920 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.920 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.920 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.920 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.920 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.920 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.921 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.921 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.921 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.921 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.921 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.921 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.921 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.922 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.922 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.922 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.922 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.922 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.922 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.923 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.923 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.923 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.923 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.923 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.923 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.923 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.924 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.924 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.924 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.924 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.924 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.924 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.924 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.925 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.925 183284 WARNING oslo_config.cfg [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Jan 21 18:05:00 compute-0 nova_compute[183278]: live_migration_uri is deprecated for removal in favor of two other options that
Jan 21 18:05:00 compute-0 nova_compute[183278]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Jan 21 18:05:00 compute-0 nova_compute[183278]: and ``live_migration_inbound_addr`` respectively.
Jan 21 18:05:00 compute-0 nova_compute[183278]: ).  Its value may be silently ignored in the future.
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.925 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.925 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.925 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.926 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.926 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.926 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.926 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.926 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.926 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.926 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.927 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.927 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.927 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.927 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.927 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.927 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.928 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.928 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.928 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.928 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.928 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.928 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.928 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.929 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.929 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.929 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.929 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.929 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.929 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.929 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.930 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.930 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.930 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.930 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.930 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.930 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.931 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.931 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.931 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.931 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.931 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.931 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.931 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.932 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.932 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.932 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.932 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.932 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.932 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.932 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.933 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.933 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.933 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.933 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.933 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.933 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.934 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.934 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.934 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.934 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.934 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.934 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.934 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.935 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.935 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.935 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.935 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.935 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.935 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.935 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.935 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.936 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.936 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.936 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.936 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.936 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.936 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.936 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.937 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.937 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.937 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.937 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.937 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.937 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.938 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.938 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.938 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.938 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.938 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.938 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.938 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.938 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.939 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.939 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.939 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.939 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.939 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.939 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.939 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.940 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.940 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.940 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.940 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.940 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.940 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.940 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.940 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.941 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.941 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.941 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.941 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.941 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.941 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.941 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.942 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.942 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.942 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.942 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.942 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.942 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.942 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.943 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.943 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.943 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.943 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.943 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.943 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.943 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.944 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.944 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.944 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.944 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.944 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.944 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.944 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.944 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.945 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.945 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.945 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.945 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.945 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.945 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.946 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.946 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.946 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.946 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.946 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.946 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.946 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.947 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.947 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.947 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.947 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.947 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.947 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.947 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.948 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.948 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.948 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.948 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.948 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.948 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.949 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.949 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.949 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.949 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.949 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.949 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.949 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.949 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.950 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.950 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.950 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.950 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.950 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.950 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.950 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.951 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.951 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.951 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.951 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.951 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.951 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.952 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.952 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.952 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.952 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.952 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.952 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.952 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.953 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.953 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.953 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.953 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.953 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.953 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.953 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.954 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.954 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.954 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.954 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.954 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.954 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.955 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.955 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.955 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.955 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.955 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.955 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.955 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.956 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.956 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.956 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.956 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.956 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.956 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.956 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.957 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.957 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.957 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.957 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.957 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.957 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.957 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.958 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.958 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.958 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.958 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.958 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.958 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.958 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.958 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.959 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.959 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.959 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.959 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.959 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.959 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.960 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.960 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.960 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.960 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.960 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.960 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.960 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.960 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.961 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.961 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.961 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.961 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.961 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.961 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.962 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.962 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.962 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.962 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.962 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.962 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.962 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.963 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.963 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.963 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.963 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.963 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.963 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.963 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.964 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.964 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.964 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.964 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.964 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.964 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.964 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.965 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.965 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.965 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.965 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.965 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.965 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.965 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.966 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.966 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.966 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.966 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.966 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.966 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.966 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.967 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.967 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.967 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.967 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.967 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.967 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.967 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.968 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.968 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.968 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.968 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.968 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.968 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.968 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.969 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.969 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.969 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.969 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.969 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.969 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.969 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.970 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.970 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.970 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.970 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.970 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.970 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.970 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.971 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.971 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.971 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.971 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.971 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.971 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.971 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.972 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.972 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.972 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.972 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.972 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.972 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.973 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.973 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.973 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.973 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.973 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.973 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.973 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.974 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.974 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.974 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.974 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.974 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.974 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.974 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.975 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.975 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.975 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.975 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.975 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.975 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.975 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.976 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.976 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.976 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.976 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.976 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.976 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.976 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.977 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.977 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.977 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.977 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.977 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.977 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.977 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.978 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.978 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.978 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.978 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.978 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.978 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.978 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.979 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.979 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.979 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.979 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.979 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.979 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.979 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.980 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.980 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.980 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.980 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.980 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.980 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.980 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.981 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.981 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.981 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.981 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.981 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.981 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.981 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.982 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.982 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.982 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.982 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.982 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.982 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.982 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.983 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.983 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.983 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.983 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.983 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.983 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.983 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.984 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.984 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.984 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.984 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.984 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.984 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.985 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.985 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.985 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.985 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.985 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.985 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.986 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.986 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.986 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.986 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.986 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.986 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.986 183284 DEBUG oslo_service.service [None req-6b8ad7a3-cc8f-4c2a-8371-134d95bf7dfa - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Jan 21 18:05:00 compute-0 nova_compute[183278]: 2026-01-21 18:05:00.987 183284 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Jan 21 18:05:01 compute-0 nova_compute[183278]: 2026-01-21 18:05:01.013 183284 INFO nova.virt.node [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Determined node identity 502e4243-611b-433d-a766-9b485d51652d from /var/lib/nova/compute_id
Jan 21 18:05:01 compute-0 nova_compute[183278]: 2026-01-21 18:05:01.013 183284 DEBUG nova.virt.libvirt.host [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Jan 21 18:05:01 compute-0 nova_compute[183278]: 2026-01-21 18:05:01.014 183284 DEBUG nova.virt.libvirt.host [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Jan 21 18:05:01 compute-0 nova_compute[183278]: 2026-01-21 18:05:01.014 183284 DEBUG nova.virt.libvirt.host [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Jan 21 18:05:01 compute-0 nova_compute[183278]: 2026-01-21 18:05:01.014 183284 DEBUG nova.virt.libvirt.host [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Jan 21 18:05:01 compute-0 nova_compute[183278]: 2026-01-21 18:05:01.031 183284 DEBUG nova.virt.libvirt.host [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f0d5531dbe0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Jan 21 18:05:01 compute-0 nova_compute[183278]: 2026-01-21 18:05:01.033 183284 DEBUG nova.virt.libvirt.host [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f0d5531dbe0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Jan 21 18:05:01 compute-0 nova_compute[183278]: 2026-01-21 18:05:01.034 183284 INFO nova.virt.libvirt.driver [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Connection event '1' reason 'None'
Jan 21 18:05:01 compute-0 nova_compute[183278]: 2026-01-21 18:05:01.044 183284 INFO nova.virt.libvirt.host [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Libvirt host capabilities <capabilities>
Jan 21 18:05:01 compute-0 nova_compute[183278]: 
Jan 21 18:05:01 compute-0 nova_compute[183278]:   <host>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <uuid>3193fe30-ac0a-415e-b2b2-55df2b1703a4</uuid>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <cpu>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <arch>x86_64</arch>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model>EPYC-Rome-v4</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <vendor>AMD</vendor>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <microcode version='16777317'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <signature family='23' model='49' stepping='0'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <maxphysaddr mode='emulate' bits='40'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature name='x2apic'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature name='tsc-deadline'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature name='osxsave'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature name='hypervisor'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature name='tsc_adjust'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature name='spec-ctrl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature name='stibp'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature name='arch-capabilities'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature name='ssbd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature name='cmp_legacy'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature name='topoext'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature name='virt-ssbd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature name='lbrv'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature name='tsc-scale'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature name='vmcb-clean'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature name='pause-filter'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature name='pfthreshold'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature name='svme-addr-chk'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature name='rdctl-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature name='skip-l1dfl-vmentry'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature name='mds-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature name='pschange-mc-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <pages unit='KiB' size='4'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <pages unit='KiB' size='2048'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <pages unit='KiB' size='1048576'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </cpu>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <power_management>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <suspend_mem/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <suspend_disk/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <suspend_hybrid/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </power_management>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <iommu support='no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <migration_features>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <live/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <uri_transports>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <uri_transport>tcp</uri_transport>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <uri_transport>rdma</uri_transport>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </uri_transports>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </migration_features>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <topology>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <cells num='1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <cell id='0'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:           <memory unit='KiB'>7864316</memory>
Jan 21 18:05:01 compute-0 nova_compute[183278]:           <pages unit='KiB' size='4'>1966079</pages>
Jan 21 18:05:01 compute-0 nova_compute[183278]:           <pages unit='KiB' size='2048'>0</pages>
Jan 21 18:05:01 compute-0 nova_compute[183278]:           <pages unit='KiB' size='1048576'>0</pages>
Jan 21 18:05:01 compute-0 nova_compute[183278]:           <distances>
Jan 21 18:05:01 compute-0 nova_compute[183278]:             <sibling id='0' value='10'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:           </distances>
Jan 21 18:05:01 compute-0 nova_compute[183278]:           <cpus num='8'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:           </cpus>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         </cell>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </cells>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </topology>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <cache>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </cache>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <secmodel>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model>selinux</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <doi>0</doi>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </secmodel>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <secmodel>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model>dac</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <doi>0</doi>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <baselabel type='kvm'>+107:+107</baselabel>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <baselabel type='qemu'>+107:+107</baselabel>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </secmodel>
Jan 21 18:05:01 compute-0 nova_compute[183278]:   </host>
Jan 21 18:05:01 compute-0 nova_compute[183278]: 
Jan 21 18:05:01 compute-0 nova_compute[183278]:   <guest>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <os_type>hvm</os_type>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <arch name='i686'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <wordsize>32</wordsize>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <domain type='qemu'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <domain type='kvm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </arch>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <features>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <pae/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <nonpae/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <acpi default='on' toggle='yes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <apic default='on' toggle='no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <cpuselection/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <deviceboot/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <disksnapshot default='on' toggle='no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <externalSnapshot/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </features>
Jan 21 18:05:01 compute-0 nova_compute[183278]:   </guest>
Jan 21 18:05:01 compute-0 nova_compute[183278]: 
Jan 21 18:05:01 compute-0 nova_compute[183278]:   <guest>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <os_type>hvm</os_type>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <arch name='x86_64'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <wordsize>64</wordsize>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <domain type='qemu'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <domain type='kvm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </arch>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <features>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <acpi default='on' toggle='yes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <apic default='on' toggle='no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <cpuselection/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <deviceboot/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <disksnapshot default='on' toggle='no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <externalSnapshot/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </features>
Jan 21 18:05:01 compute-0 nova_compute[183278]:   </guest>
Jan 21 18:05:01 compute-0 nova_compute[183278]: 
Jan 21 18:05:01 compute-0 nova_compute[183278]: </capabilities>
Jan 21 18:05:01 compute-0 nova_compute[183278]: 
Jan 21 18:05:01 compute-0 nova_compute[183278]: 2026-01-21 18:05:01.046 183284 DEBUG nova.virt.libvirt.volume.mount [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Jan 21 18:05:01 compute-0 nova_compute[183278]: 2026-01-21 18:05:01.052 183284 DEBUG nova.virt.libvirt.host [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Jan 21 18:05:01 compute-0 nova_compute[183278]: 2026-01-21 18:05:01.056 183284 DEBUG nova.virt.libvirt.host [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Jan 21 18:05:01 compute-0 nova_compute[183278]: <domainCapabilities>
Jan 21 18:05:01 compute-0 nova_compute[183278]:   <path>/usr/libexec/qemu-kvm</path>
Jan 21 18:05:01 compute-0 nova_compute[183278]:   <domain>kvm</domain>
Jan 21 18:05:01 compute-0 nova_compute[183278]:   <machine>pc-q35-rhel9.8.0</machine>
Jan 21 18:05:01 compute-0 nova_compute[183278]:   <arch>i686</arch>
Jan 21 18:05:01 compute-0 nova_compute[183278]:   <vcpu max='4096'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:   <iothreads supported='yes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:   <os supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <enum name='firmware'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <loader supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='type'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>rom</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>pflash</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='readonly'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>yes</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>no</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='secure'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>no</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </loader>
Jan 21 18:05:01 compute-0 nova_compute[183278]:   </os>
Jan 21 18:05:01 compute-0 nova_compute[183278]:   <cpu>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <mode name='host-passthrough' supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='hostPassthroughMigratable'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>on</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>off</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </mode>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <mode name='maximum' supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='maximumMigratable'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>on</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>off</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </mode>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <mode name='host-model' supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <vendor>AMD</vendor>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='x2apic'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='tsc-deadline'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='hypervisor'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='tsc_adjust'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='spec-ctrl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='stibp'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='ssbd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='cmp_legacy'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='overflow-recov'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='succor'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='ibrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='amd-ssbd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='virt-ssbd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='lbrv'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='tsc-scale'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='vmcb-clean'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='flushbyasid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='pause-filter'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='pfthreshold'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='svme-addr-chk'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='disable' name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </mode>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <mode name='custom' supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Broadwell'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Broadwell-IBRS'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Broadwell-noTSX'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Broadwell-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Broadwell-v2'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Broadwell-v3'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Broadwell-v4'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Cascadelake-Server'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Cascadelake-Server-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Cascadelake-Server-v2'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Cascadelake-Server-v3'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Cascadelake-Server-v4'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Cascadelake-Server-v5'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='ClearwaterForest'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-ne-convert'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni-int16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni-int8'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bhi-ctrl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bhi-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bus-lock-detect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='cldemote'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='cmpccxadd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ddpd-u'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fbsdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='intel-psfd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ipred-ctrl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='lam'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='mcdt-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdir64b'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdiri'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pbrsb-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='prefetchiti'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='psdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rrsba-ctrl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='serialize'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sha512'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sm3'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sm4'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ss'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='ClearwaterForest-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-ne-convert'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni-int16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni-int8'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bhi-ctrl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bhi-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bus-lock-detect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='cldemote'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='cmpccxadd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ddpd-u'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fbsdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='intel-psfd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ipred-ctrl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='lam'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='mcdt-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdir64b'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdiri'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pbrsb-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='prefetchiti'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='psdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rrsba-ctrl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='serialize'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sha512'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sm3'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sm4'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ss'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Cooperlake'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='taa-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Cooperlake-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='taa-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Cooperlake-v2'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='taa-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Denverton'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='mpx'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Denverton-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='mpx'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Denverton-v2'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Denverton-v3'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Dhyana-v2'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='EPYC-Genoa'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amd-psfd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='auto-ibrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='no-nested-data-bp'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='null-sel-clr-base'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='stibp-always-on'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='EPYC-Genoa-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amd-psfd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='auto-ibrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='no-nested-data-bp'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='null-sel-clr-base'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='stibp-always-on'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='EPYC-Genoa-v2'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amd-psfd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='auto-ibrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fs-gs-base-ns'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='no-nested-data-bp'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='null-sel-clr-base'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='perfmon-v2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='stibp-always-on'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='EPYC-Milan'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='EPYC-Milan-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='EPYC-Milan-v2'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amd-psfd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='no-nested-data-bp'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='null-sel-clr-base'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='stibp-always-on'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='EPYC-Milan-v3'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amd-psfd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='no-nested-data-bp'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='null-sel-clr-base'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='stibp-always-on'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='EPYC-Rome'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='EPYC-Rome-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='EPYC-Rome-v2'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='EPYC-Rome-v3'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='EPYC-Turin'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amd-psfd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='auto-ibrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vp2intersect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fs-gs-base-ns'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibpb-brtype'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdir64b'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdiri'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='no-nested-data-bp'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='null-sel-clr-base'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='perfmon-v2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='prefetchi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sbpb'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='srso-user-kernel-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='stibp-always-on'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='EPYC-Turin-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amd-psfd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='auto-ibrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vp2intersect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fs-gs-base-ns'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibpb-brtype'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdir64b'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdiri'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='no-nested-data-bp'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='null-sel-clr-base'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='perfmon-v2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='prefetchi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sbpb'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='srso-user-kernel-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='stibp-always-on'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='EPYC-v3'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='EPYC-v4'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='EPYC-v5'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='GraniteRapids'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-fp16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-int8'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-tile'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-fp16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bus-lock-detect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fbsdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrc'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fzrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='mcdt-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pbrsb-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='prefetchiti'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='psdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='serialize'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='taa-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='tsx-ldtrk'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xfd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='GraniteRapids-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-fp16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-int8'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-tile'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-fp16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bus-lock-detect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fbsdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrc'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fzrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='mcdt-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pbrsb-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='prefetchiti'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='psdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='serialize'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='taa-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='tsx-ldtrk'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xfd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='GraniteRapids-v2'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-fp16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-int8'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-tile'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx10'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx10-128'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx10-256'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx10-512'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-fp16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bus-lock-detect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='cldemote'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fbsdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrc'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fzrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='mcdt-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdir64b'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdiri'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pbrsb-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='prefetchiti'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='psdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='serialize'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ss'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='taa-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='tsx-ldtrk'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xfd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='GraniteRapids-v3'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-fp16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-int8'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-tile'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx10'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx10-128'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx10-256'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx10-512'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-fp16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bus-lock-detect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='cldemote'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fbsdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrc'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fzrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='mcdt-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdir64b'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdiri'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pbrsb-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='prefetchiti'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='psdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='serialize'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ss'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='taa-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='tsx-ldtrk'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xfd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Haswell'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Haswell-IBRS'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Haswell-noTSX'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Haswell-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Haswell-v2'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Haswell-v3'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Haswell-v4'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Icelake-Server'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Icelake-Server-noTSX'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Icelake-Server-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Icelake-Server-v2'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Icelake-Server-v3'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='taa-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Icelake-Server-v4'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='taa-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Icelake-Server-v5'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='taa-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Icelake-Server-v6'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='taa-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Icelake-Server-v7'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='taa-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='IvyBridge'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='IvyBridge-IBRS'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='IvyBridge-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='IvyBridge-v2'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='KnightsMill'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-4fmaps'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-4vnniw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512er'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512pf'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ss'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='KnightsMill-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-4fmaps'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-4vnniw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512er'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512pf'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ss'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Opteron_G4'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fma4'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xop'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Opteron_G4-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fma4'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xop'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Opteron_G5'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fma4'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='tbm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xop'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Opteron_G5-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fma4'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='tbm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xop'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='SapphireRapids'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-int8'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-tile'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-fp16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bus-lock-detect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrc'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fzrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='serialize'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='taa-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='tsx-ldtrk'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xfd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='SapphireRapids-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-int8'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-tile'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-fp16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bus-lock-detect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrc'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fzrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='serialize'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='taa-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='tsx-ldtrk'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xfd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='SapphireRapids-v2'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-int8'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-tile'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-fp16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bus-lock-detect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fbsdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrc'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fzrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='psdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='serialize'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='taa-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='tsx-ldtrk'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xfd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='SapphireRapids-v3'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-int8'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-tile'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-fp16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bus-lock-detect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='cldemote'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fbsdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrc'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fzrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdir64b'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdiri'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='psdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='serialize'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ss'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='taa-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='tsx-ldtrk'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xfd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='SapphireRapids-v4'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-int8'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-tile'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-fp16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bus-lock-detect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='cldemote'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fbsdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrc'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fzrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdir64b'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdiri'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='psdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='serialize'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ss'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='taa-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='tsx-ldtrk'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xfd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='SierraForest'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-ne-convert'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni-int8'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bus-lock-detect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='cmpccxadd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fbsdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='mcdt-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pbrsb-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='psdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='serialize'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='SierraForest-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-ne-convert'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni-int8'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bus-lock-detect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='cmpccxadd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fbsdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='mcdt-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pbrsb-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='psdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='serialize'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='SierraForest-v2'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-ne-convert'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni-int8'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bhi-ctrl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bus-lock-detect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='cldemote'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='cmpccxadd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fbsdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='intel-psfd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ipred-ctrl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='lam'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='mcdt-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdir64b'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdiri'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pbrsb-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='psdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rrsba-ctrl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='serialize'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ss'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='SierraForest-v3'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-ne-convert'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni-int8'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bhi-ctrl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bus-lock-detect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='cldemote'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='cmpccxadd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fbsdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='intel-psfd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ipred-ctrl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='lam'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='mcdt-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdir64b'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdiri'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pbrsb-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='psdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rrsba-ctrl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='serialize'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ss'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Skylake-Client'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Skylake-Client-IBRS'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Skylake-Client-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Skylake-Client-v2'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Skylake-Client-v3'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Skylake-Client-v4'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Skylake-Server'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Skylake-Server-IBRS'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Skylake-Server-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Skylake-Server-v2'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Skylake-Server-v3'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Skylake-Server-v4'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Skylake-Server-v5'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Snowridge'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='cldemote'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='core-capability'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdir64b'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdiri'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='mpx'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='split-lock-detect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Snowridge-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='cldemote'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='core-capability'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdir64b'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdiri'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='mpx'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='split-lock-detect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Snowridge-v2'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='cldemote'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='core-capability'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdir64b'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdiri'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='split-lock-detect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Snowridge-v3'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='cldemote'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='core-capability'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdir64b'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdiri'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='split-lock-detect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Snowridge-v4'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='cldemote'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdir64b'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdiri'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='athlon'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='3dnow'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='3dnowext'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='athlon-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='3dnow'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='3dnowext'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='core2duo'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ss'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='core2duo-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ss'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='coreduo'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ss'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='coreduo-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ss'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='n270'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ss'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='n270-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ss'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='phenom'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='3dnow'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='3dnowext'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='phenom-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='3dnow'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='3dnowext'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </mode>
Jan 21 18:05:01 compute-0 nova_compute[183278]:   </cpu>
Jan 21 18:05:01 compute-0 nova_compute[183278]:   <memoryBacking supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <enum name='sourceType'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <value>file</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <value>anonymous</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <value>memfd</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:   </memoryBacking>
Jan 21 18:05:01 compute-0 nova_compute[183278]:   <devices>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <disk supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='diskDevice'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>disk</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>cdrom</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>floppy</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>lun</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='bus'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>fdc</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>scsi</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>virtio</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>usb</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>sata</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='model'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>virtio</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>virtio-transitional</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>virtio-non-transitional</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </disk>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <graphics supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='type'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>vnc</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>egl-headless</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>dbus</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </graphics>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <video supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='modelType'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>vga</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>cirrus</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>virtio</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>none</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>bochs</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>ramfb</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </video>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <hostdev supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='mode'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>subsystem</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='startupPolicy'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>default</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>mandatory</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>requisite</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>optional</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='subsysType'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>usb</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>pci</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>scsi</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='capsType'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='pciBackend'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </hostdev>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <rng supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='model'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>virtio</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>virtio-transitional</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>virtio-non-transitional</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='backendModel'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>random</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>egd</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>builtin</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </rng>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <filesystem supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='driverType'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>path</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>handle</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>virtiofs</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </filesystem>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <tpm supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='model'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>tpm-tis</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>tpm-crb</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='backendModel'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>emulator</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>external</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='backendVersion'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>2.0</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </tpm>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <redirdev supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='bus'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>usb</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </redirdev>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <channel supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='type'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>pty</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>unix</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </channel>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <crypto supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='model'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='type'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>qemu</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='backendModel'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>builtin</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </crypto>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <interface supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='backendType'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>default</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>passt</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </interface>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <panic supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='model'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>isa</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>hyperv</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </panic>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <console supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='type'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>null</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>vc</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>pty</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>dev</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>file</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>pipe</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>stdio</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>udp</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>tcp</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>unix</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>qemu-vdagent</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>dbus</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </console>
Jan 21 18:05:01 compute-0 nova_compute[183278]:   </devices>
Jan 21 18:05:01 compute-0 nova_compute[183278]:   <features>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <gic supported='no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <vmcoreinfo supported='yes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <genid supported='yes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <backingStoreInput supported='yes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <backup supported='yes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <async-teardown supported='yes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <s390-pv supported='no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <ps2 supported='yes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <tdx supported='no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <sev supported='no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <sgx supported='no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <hyperv supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='features'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>relaxed</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>vapic</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>spinlocks</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>vpindex</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>runtime</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>synic</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>stimer</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>reset</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>vendor_id</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>frequencies</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>reenlightenment</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>tlbflush</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>ipi</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>avic</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>emsr_bitmap</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>xmm_input</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <defaults>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <spinlocks>4095</spinlocks>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <stimer_direct>on</stimer_direct>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <tlbflush_direct>on</tlbflush_direct>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <tlbflush_extended>on</tlbflush_extended>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </defaults>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </hyperv>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <launchSecurity supported='no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:   </features>
Jan 21 18:05:01 compute-0 nova_compute[183278]: </domainCapabilities>
Jan 21 18:05:01 compute-0 nova_compute[183278]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 21 18:05:01 compute-0 nova_compute[183278]: 2026-01-21 18:05:01.064 183284 DEBUG nova.virt.libvirt.host [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Jan 21 18:05:01 compute-0 nova_compute[183278]: <domainCapabilities>
Jan 21 18:05:01 compute-0 nova_compute[183278]:   <path>/usr/libexec/qemu-kvm</path>
Jan 21 18:05:01 compute-0 nova_compute[183278]:   <domain>kvm</domain>
Jan 21 18:05:01 compute-0 nova_compute[183278]:   <machine>pc-i440fx-rhel7.6.0</machine>
Jan 21 18:05:01 compute-0 nova_compute[183278]:   <arch>i686</arch>
Jan 21 18:05:01 compute-0 nova_compute[183278]:   <vcpu max='240'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:   <iothreads supported='yes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:   <os supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <enum name='firmware'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <loader supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='type'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>rom</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>pflash</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='readonly'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>yes</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>no</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='secure'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>no</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </loader>
Jan 21 18:05:01 compute-0 nova_compute[183278]:   </os>
Jan 21 18:05:01 compute-0 nova_compute[183278]:   <cpu>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <mode name='host-passthrough' supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='hostPassthroughMigratable'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>on</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>off</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </mode>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <mode name='maximum' supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='maximumMigratable'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>on</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>off</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </mode>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <mode name='host-model' supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <vendor>AMD</vendor>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='x2apic'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='tsc-deadline'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='hypervisor'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='tsc_adjust'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='spec-ctrl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='stibp'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='ssbd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='cmp_legacy'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='overflow-recov'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='succor'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='ibrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='amd-ssbd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='virt-ssbd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='lbrv'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='tsc-scale'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='vmcb-clean'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='flushbyasid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='pause-filter'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='pfthreshold'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='svme-addr-chk'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='disable' name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </mode>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <mode name='custom' supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Broadwell'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Broadwell-IBRS'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Broadwell-noTSX'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Broadwell-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Broadwell-v2'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Broadwell-v3'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Broadwell-v4'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Cascadelake-Server'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Cascadelake-Server-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Cascadelake-Server-v2'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Cascadelake-Server-v3'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Cascadelake-Server-v4'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Cascadelake-Server-v5'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='ClearwaterForest'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-ne-convert'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni-int16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni-int8'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bhi-ctrl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bhi-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bus-lock-detect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='cldemote'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='cmpccxadd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ddpd-u'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fbsdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='intel-psfd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ipred-ctrl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='lam'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='mcdt-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdir64b'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdiri'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pbrsb-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='prefetchiti'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='psdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rrsba-ctrl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='serialize'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sha512'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sm3'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sm4'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ss'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='ClearwaterForest-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-ne-convert'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni-int16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni-int8'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bhi-ctrl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bhi-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bus-lock-detect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='cldemote'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='cmpccxadd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ddpd-u'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fbsdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='intel-psfd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ipred-ctrl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='lam'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='mcdt-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdir64b'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdiri'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pbrsb-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='prefetchiti'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='psdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rrsba-ctrl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='serialize'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sha512'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sm3'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sm4'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ss'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Cooperlake'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='taa-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Cooperlake-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='taa-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Cooperlake-v2'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='taa-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Denverton'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='mpx'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Denverton-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='mpx'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Denverton-v2'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Denverton-v3'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Dhyana-v2'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='EPYC-Genoa'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amd-psfd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='auto-ibrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='no-nested-data-bp'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='null-sel-clr-base'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='stibp-always-on'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='EPYC-Genoa-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amd-psfd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='auto-ibrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='no-nested-data-bp'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='null-sel-clr-base'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='stibp-always-on'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='EPYC-Genoa-v2'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amd-psfd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='auto-ibrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fs-gs-base-ns'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='no-nested-data-bp'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='null-sel-clr-base'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='perfmon-v2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='stibp-always-on'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='EPYC-Milan'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='EPYC-Milan-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='EPYC-Milan-v2'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amd-psfd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='no-nested-data-bp'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='null-sel-clr-base'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='stibp-always-on'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='EPYC-Milan-v3'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amd-psfd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='no-nested-data-bp'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='null-sel-clr-base'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='stibp-always-on'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='EPYC-Rome'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='EPYC-Rome-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='EPYC-Rome-v2'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='EPYC-Rome-v3'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='EPYC-Turin'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amd-psfd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='auto-ibrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vp2intersect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fs-gs-base-ns'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibpb-brtype'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdir64b'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdiri'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='no-nested-data-bp'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='null-sel-clr-base'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='perfmon-v2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='prefetchi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sbpb'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='srso-user-kernel-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='stibp-always-on'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='EPYC-Turin-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amd-psfd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='auto-ibrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vp2intersect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fs-gs-base-ns'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibpb-brtype'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdir64b'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdiri'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='no-nested-data-bp'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='null-sel-clr-base'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='perfmon-v2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='prefetchi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sbpb'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='srso-user-kernel-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='stibp-always-on'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='EPYC-v3'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='EPYC-v4'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='EPYC-v5'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='GraniteRapids'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-fp16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-int8'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-tile'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-fp16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bus-lock-detect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fbsdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrc'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fzrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='mcdt-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pbrsb-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='prefetchiti'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='psdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='serialize'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='taa-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='tsx-ldtrk'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xfd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='GraniteRapids-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-fp16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-int8'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-tile'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-fp16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bus-lock-detect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fbsdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrc'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fzrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='mcdt-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pbrsb-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='prefetchiti'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='psdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='serialize'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='taa-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='tsx-ldtrk'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xfd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='GraniteRapids-v2'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-fp16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-int8'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-tile'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx10'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx10-128'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx10-256'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx10-512'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-fp16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bus-lock-detect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='cldemote'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fbsdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrc'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fzrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='mcdt-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdir64b'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdiri'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pbrsb-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='prefetchiti'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='psdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='serialize'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ss'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='taa-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='tsx-ldtrk'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xfd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='GraniteRapids-v3'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-fp16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-int8'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-tile'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx10'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx10-128'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx10-256'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx10-512'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-fp16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bus-lock-detect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='cldemote'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fbsdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrc'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fzrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='mcdt-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdir64b'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdiri'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pbrsb-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='prefetchiti'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='psdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='serialize'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ss'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='taa-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='tsx-ldtrk'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xfd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Haswell'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Haswell-IBRS'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Haswell-noTSX'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Haswell-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Haswell-v2'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Haswell-v3'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Haswell-v4'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Icelake-Server'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Icelake-Server-noTSX'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Icelake-Server-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Icelake-Server-v2'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Icelake-Server-v3'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='taa-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Icelake-Server-v4'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='taa-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Icelake-Server-v5'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='taa-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Icelake-Server-v6'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='taa-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Icelake-Server-v7'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='taa-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='IvyBridge'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='IvyBridge-IBRS'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='IvyBridge-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='IvyBridge-v2'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='KnightsMill'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-4fmaps'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-4vnniw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512er'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512pf'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ss'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='KnightsMill-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-4fmaps'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-4vnniw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512er'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512pf'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ss'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Opteron_G4'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fma4'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xop'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Opteron_G4-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fma4'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xop'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Opteron_G5'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fma4'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='tbm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xop'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Opteron_G5-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fma4'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='tbm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xop'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='SapphireRapids'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-int8'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-tile'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-fp16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bus-lock-detect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrc'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fzrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='serialize'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='taa-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='tsx-ldtrk'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xfd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='SapphireRapids-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-int8'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-tile'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-fp16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bus-lock-detect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrc'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fzrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='serialize'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='taa-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='tsx-ldtrk'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xfd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='SapphireRapids-v2'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-int8'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-tile'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-fp16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bus-lock-detect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fbsdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrc'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fzrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='psdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='serialize'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='taa-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='tsx-ldtrk'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xfd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='SapphireRapids-v3'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-int8'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-tile'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-fp16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bus-lock-detect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='cldemote'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fbsdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrc'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fzrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdir64b'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdiri'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='psdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='serialize'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ss'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='taa-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='tsx-ldtrk'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xfd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='SapphireRapids-v4'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-int8'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-tile'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-fp16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bus-lock-detect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='cldemote'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fbsdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrc'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fzrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdir64b'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdiri'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='psdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='serialize'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ss'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='taa-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='tsx-ldtrk'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xfd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='SierraForest'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-ne-convert'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni-int8'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bus-lock-detect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='cmpccxadd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fbsdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='mcdt-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pbrsb-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='psdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='serialize'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='SierraForest-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-ne-convert'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni-int8'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bus-lock-detect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='cmpccxadd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fbsdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='mcdt-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pbrsb-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='psdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='serialize'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='SierraForest-v2'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-ne-convert'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni-int8'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bhi-ctrl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bus-lock-detect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='cldemote'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='cmpccxadd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fbsdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='intel-psfd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ipred-ctrl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='lam'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='mcdt-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdir64b'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdiri'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pbrsb-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='psdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rrsba-ctrl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='serialize'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ss'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='SierraForest-v3'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-ne-convert'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni-int8'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bhi-ctrl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bus-lock-detect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='cldemote'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='cmpccxadd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fbsdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='intel-psfd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ipred-ctrl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='lam'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='mcdt-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdir64b'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdiri'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pbrsb-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='psdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rrsba-ctrl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='serialize'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ss'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Skylake-Client'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Skylake-Client-IBRS'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Skylake-Client-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Skylake-Client-v2'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Skylake-Client-v3'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Skylake-Client-v4'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Skylake-Server'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Skylake-Server-IBRS'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Skylake-Server-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Skylake-Server-v2'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Skylake-Server-v3'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Skylake-Server-v4'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Skylake-Server-v5'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Snowridge'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='cldemote'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='core-capability'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdir64b'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdiri'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='mpx'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='split-lock-detect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Snowridge-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='cldemote'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='core-capability'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdir64b'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdiri'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='mpx'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='split-lock-detect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Snowridge-v2'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='cldemote'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='core-capability'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdir64b'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdiri'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='split-lock-detect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Snowridge-v3'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='cldemote'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='core-capability'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdir64b'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdiri'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='split-lock-detect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Snowridge-v4'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='cldemote'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdir64b'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdiri'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='athlon'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='3dnow'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='3dnowext'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='athlon-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='3dnow'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='3dnowext'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='core2duo'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ss'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='core2duo-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ss'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='coreduo'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ss'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='coreduo-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ss'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='n270'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ss'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='n270-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ss'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='phenom'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='3dnow'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='3dnowext'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='phenom-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='3dnow'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='3dnowext'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </mode>
Jan 21 18:05:01 compute-0 nova_compute[183278]:   </cpu>
Jan 21 18:05:01 compute-0 nova_compute[183278]:   <memoryBacking supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <enum name='sourceType'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <value>file</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <value>anonymous</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <value>memfd</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:   </memoryBacking>
Jan 21 18:05:01 compute-0 nova_compute[183278]:   <devices>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <disk supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='diskDevice'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>disk</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>cdrom</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>floppy</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>lun</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='bus'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>ide</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>fdc</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>scsi</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>virtio</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>usb</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>sata</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='model'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>virtio</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>virtio-transitional</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>virtio-non-transitional</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </disk>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <graphics supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='type'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>vnc</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>egl-headless</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>dbus</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </graphics>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <video supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='modelType'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>vga</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>cirrus</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>virtio</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>none</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>bochs</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>ramfb</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </video>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <hostdev supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='mode'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>subsystem</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='startupPolicy'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>default</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>mandatory</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>requisite</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>optional</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='subsysType'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>usb</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>pci</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>scsi</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='capsType'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='pciBackend'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </hostdev>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <rng supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='model'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>virtio</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>virtio-transitional</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>virtio-non-transitional</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='backendModel'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>random</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>egd</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>builtin</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </rng>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <filesystem supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='driverType'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>path</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>handle</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>virtiofs</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </filesystem>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <tpm supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='model'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>tpm-tis</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>tpm-crb</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='backendModel'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>emulator</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>external</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='backendVersion'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>2.0</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </tpm>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <redirdev supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='bus'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>usb</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </redirdev>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <channel supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='type'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>pty</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>unix</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </channel>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <crypto supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='model'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='type'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>qemu</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='backendModel'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>builtin</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </crypto>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <interface supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='backendType'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>default</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>passt</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </interface>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <panic supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='model'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>isa</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>hyperv</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </panic>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <console supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='type'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>null</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>vc</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>pty</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>dev</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>file</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>pipe</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>stdio</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>udp</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>tcp</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>unix</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>qemu-vdagent</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>dbus</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </console>
Jan 21 18:05:01 compute-0 nova_compute[183278]:   </devices>
Jan 21 18:05:01 compute-0 nova_compute[183278]:   <features>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <gic supported='no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <vmcoreinfo supported='yes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <genid supported='yes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <backingStoreInput supported='yes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <backup supported='yes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <async-teardown supported='yes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <s390-pv supported='no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <ps2 supported='yes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <tdx supported='no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <sev supported='no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <sgx supported='no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <hyperv supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='features'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>relaxed</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>vapic</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>spinlocks</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>vpindex</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>runtime</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>synic</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>stimer</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>reset</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>vendor_id</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>frequencies</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>reenlightenment</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>tlbflush</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>ipi</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>avic</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>emsr_bitmap</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>xmm_input</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <defaults>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <spinlocks>4095</spinlocks>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <stimer_direct>on</stimer_direct>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <tlbflush_direct>on</tlbflush_direct>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <tlbflush_extended>on</tlbflush_extended>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </defaults>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </hyperv>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <launchSecurity supported='no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:   </features>
Jan 21 18:05:01 compute-0 nova_compute[183278]: </domainCapabilities>
Jan 21 18:05:01 compute-0 nova_compute[183278]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 21 18:05:01 compute-0 nova_compute[183278]: 2026-01-21 18:05:01.130 183284 DEBUG nova.virt.libvirt.host [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Jan 21 18:05:01 compute-0 nova_compute[183278]: 2026-01-21 18:05:01.136 183284 DEBUG nova.virt.libvirt.host [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Jan 21 18:05:01 compute-0 nova_compute[183278]: <domainCapabilities>
Jan 21 18:05:01 compute-0 nova_compute[183278]:   <path>/usr/libexec/qemu-kvm</path>
Jan 21 18:05:01 compute-0 nova_compute[183278]:   <domain>kvm</domain>
Jan 21 18:05:01 compute-0 nova_compute[183278]:   <machine>pc-q35-rhel9.8.0</machine>
Jan 21 18:05:01 compute-0 nova_compute[183278]:   <arch>x86_64</arch>
Jan 21 18:05:01 compute-0 nova_compute[183278]:   <vcpu max='4096'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:   <iothreads supported='yes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:   <os supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <enum name='firmware'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <value>efi</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <loader supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='type'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>rom</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>pflash</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='readonly'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>yes</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>no</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='secure'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>yes</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>no</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </loader>
Jan 21 18:05:01 compute-0 nova_compute[183278]:   </os>
Jan 21 18:05:01 compute-0 nova_compute[183278]:   <cpu>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <mode name='host-passthrough' supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='hostPassthroughMigratable'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>on</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>off</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </mode>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <mode name='maximum' supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='maximumMigratable'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>on</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>off</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </mode>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <mode name='host-model' supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <vendor>AMD</vendor>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='x2apic'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='tsc-deadline'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='hypervisor'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='tsc_adjust'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='spec-ctrl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='stibp'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='ssbd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='cmp_legacy'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='overflow-recov'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='succor'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='ibrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='amd-ssbd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='virt-ssbd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='lbrv'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='tsc-scale'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='vmcb-clean'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='flushbyasid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='pause-filter'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='pfthreshold'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='svme-addr-chk'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='disable' name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </mode>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <mode name='custom' supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Broadwell'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Broadwell-IBRS'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Broadwell-noTSX'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Broadwell-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Broadwell-v2'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Broadwell-v3'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Broadwell-v4'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Cascadelake-Server'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Cascadelake-Server-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Cascadelake-Server-v2'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Cascadelake-Server-v3'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Cascadelake-Server-v4'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Cascadelake-Server-v5'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='ClearwaterForest'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-ne-convert'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni-int16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni-int8'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bhi-ctrl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bhi-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bus-lock-detect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='cldemote'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='cmpccxadd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ddpd-u'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fbsdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='intel-psfd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ipred-ctrl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='lam'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='mcdt-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdir64b'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdiri'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pbrsb-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='prefetchiti'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='psdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rrsba-ctrl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='serialize'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sha512'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sm3'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sm4'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ss'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='ClearwaterForest-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-ne-convert'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni-int16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni-int8'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bhi-ctrl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bhi-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bus-lock-detect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='cldemote'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='cmpccxadd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ddpd-u'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fbsdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='intel-psfd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ipred-ctrl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='lam'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='mcdt-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdir64b'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdiri'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pbrsb-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='prefetchiti'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='psdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rrsba-ctrl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='serialize'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sha512'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sm3'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sm4'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ss'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Cooperlake'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='taa-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Cooperlake-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='taa-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Cooperlake-v2'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='taa-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Denverton'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='mpx'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Denverton-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='mpx'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Denverton-v2'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Denverton-v3'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Dhyana-v2'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='EPYC-Genoa'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amd-psfd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='auto-ibrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='no-nested-data-bp'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='null-sel-clr-base'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='stibp-always-on'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='EPYC-Genoa-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amd-psfd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='auto-ibrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='no-nested-data-bp'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='null-sel-clr-base'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='stibp-always-on'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='EPYC-Genoa-v2'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amd-psfd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='auto-ibrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fs-gs-base-ns'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='no-nested-data-bp'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='null-sel-clr-base'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='perfmon-v2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='stibp-always-on'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='EPYC-Milan'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='EPYC-Milan-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='EPYC-Milan-v2'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amd-psfd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='no-nested-data-bp'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='null-sel-clr-base'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='stibp-always-on'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='EPYC-Milan-v3'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amd-psfd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='no-nested-data-bp'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='null-sel-clr-base'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='stibp-always-on'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='EPYC-Rome'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='EPYC-Rome-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='EPYC-Rome-v2'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='EPYC-Rome-v3'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='EPYC-Turin'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amd-psfd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='auto-ibrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vp2intersect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fs-gs-base-ns'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibpb-brtype'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdir64b'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdiri'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='no-nested-data-bp'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='null-sel-clr-base'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='perfmon-v2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='prefetchi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sbpb'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='srso-user-kernel-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='stibp-always-on'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='EPYC-Turin-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amd-psfd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='auto-ibrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vp2intersect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fs-gs-base-ns'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibpb-brtype'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdir64b'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdiri'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='no-nested-data-bp'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='null-sel-clr-base'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='perfmon-v2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='prefetchi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sbpb'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='srso-user-kernel-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='stibp-always-on'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='EPYC-v3'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='EPYC-v4'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='EPYC-v5'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='GraniteRapids'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-fp16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-int8'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-tile'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-fp16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bus-lock-detect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fbsdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrc'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fzrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='mcdt-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pbrsb-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='prefetchiti'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='psdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='serialize'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='taa-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='tsx-ldtrk'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xfd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='GraniteRapids-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-fp16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-int8'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-tile'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-fp16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bus-lock-detect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fbsdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrc'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fzrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='mcdt-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pbrsb-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='prefetchiti'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='psdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='serialize'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='taa-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='tsx-ldtrk'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xfd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='GraniteRapids-v2'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-fp16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-int8'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-tile'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx10'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx10-128'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx10-256'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx10-512'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-fp16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bus-lock-detect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='cldemote'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fbsdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrc'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fzrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='mcdt-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdir64b'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdiri'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pbrsb-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='prefetchiti'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='psdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='serialize'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ss'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='taa-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='tsx-ldtrk'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xfd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='GraniteRapids-v3'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-fp16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-int8'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-tile'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx10'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx10-128'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx10-256'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx10-512'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-fp16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bus-lock-detect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='cldemote'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fbsdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrc'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fzrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='mcdt-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdir64b'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdiri'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pbrsb-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='prefetchiti'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='psdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='serialize'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ss'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='taa-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='tsx-ldtrk'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xfd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Haswell'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Haswell-IBRS'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Haswell-noTSX'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Haswell-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Haswell-v2'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Haswell-v3'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Haswell-v4'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Icelake-Server'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Icelake-Server-noTSX'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Icelake-Server-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Icelake-Server-v2'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Icelake-Server-v3'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='taa-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Icelake-Server-v4'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='taa-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Icelake-Server-v5'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='taa-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Icelake-Server-v6'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='taa-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Icelake-Server-v7'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='taa-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='IvyBridge'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='IvyBridge-IBRS'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='IvyBridge-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='IvyBridge-v2'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='KnightsMill'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-4fmaps'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-4vnniw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512er'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512pf'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ss'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='KnightsMill-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-4fmaps'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-4vnniw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512er'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512pf'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ss'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Opteron_G4'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fma4'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xop'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Opteron_G4-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fma4'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xop'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Opteron_G5'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fma4'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='tbm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xop'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Opteron_G5-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fma4'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='tbm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xop'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='SapphireRapids'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-int8'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-tile'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-fp16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bus-lock-detect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrc'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fzrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='serialize'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='taa-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='tsx-ldtrk'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xfd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='SapphireRapids-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-int8'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-tile'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-fp16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bus-lock-detect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrc'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fzrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='serialize'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='taa-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='tsx-ldtrk'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xfd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='SapphireRapids-v2'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-int8'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-tile'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-fp16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bus-lock-detect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fbsdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrc'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fzrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='psdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='serialize'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='taa-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='tsx-ldtrk'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xfd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='SapphireRapids-v3'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-int8'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-tile'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-fp16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bus-lock-detect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='cldemote'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fbsdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrc'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fzrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdir64b'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdiri'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='psdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='serialize'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ss'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='taa-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='tsx-ldtrk'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xfd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='SapphireRapids-v4'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-int8'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-tile'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-fp16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bus-lock-detect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='cldemote'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fbsdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrc'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fzrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdir64b'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdiri'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='psdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='serialize'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ss'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='taa-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='tsx-ldtrk'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xfd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='SierraForest'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-ne-convert'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni-int8'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bus-lock-detect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='cmpccxadd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fbsdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='mcdt-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pbrsb-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='psdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='serialize'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='SierraForest-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-ne-convert'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni-int8'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bus-lock-detect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='cmpccxadd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fbsdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='mcdt-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pbrsb-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='psdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='serialize'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='SierraForest-v2'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-ne-convert'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni-int8'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bhi-ctrl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bus-lock-detect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='cldemote'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='cmpccxadd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fbsdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='intel-psfd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ipred-ctrl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='lam'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='mcdt-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdir64b'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdiri'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pbrsb-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='psdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rrsba-ctrl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='serialize'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ss'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='SierraForest-v3'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-ne-convert'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni-int8'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bhi-ctrl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bus-lock-detect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='cldemote'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='cmpccxadd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fbsdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='intel-psfd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ipred-ctrl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='lam'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='mcdt-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdir64b'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdiri'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pbrsb-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='psdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rrsba-ctrl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='serialize'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ss'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Skylake-Client'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Skylake-Client-IBRS'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Skylake-Client-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Skylake-Client-v2'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Skylake-Client-v3'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Skylake-Client-v4'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Skylake-Server'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Skylake-Server-IBRS'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Skylake-Server-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Skylake-Server-v2'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Skylake-Server-v3'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Skylake-Server-v4'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Skylake-Server-v5'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Snowridge'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='cldemote'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='core-capability'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdir64b'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdiri'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='mpx'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='split-lock-detect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Snowridge-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='cldemote'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='core-capability'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdir64b'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdiri'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='mpx'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='split-lock-detect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Snowridge-v2'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='cldemote'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='core-capability'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdir64b'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdiri'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='split-lock-detect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Snowridge-v3'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='cldemote'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='core-capability'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdir64b'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdiri'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='split-lock-detect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Snowridge-v4'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='cldemote'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdir64b'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdiri'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='athlon'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='3dnow'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='3dnowext'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='athlon-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='3dnow'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='3dnowext'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='core2duo'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ss'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='core2duo-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ss'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='coreduo'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ss'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='coreduo-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ss'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='n270'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ss'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='n270-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ss'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='phenom'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='3dnow'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='3dnowext'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='phenom-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='3dnow'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='3dnowext'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </mode>
Jan 21 18:05:01 compute-0 nova_compute[183278]:   </cpu>
Jan 21 18:05:01 compute-0 nova_compute[183278]:   <memoryBacking supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <enum name='sourceType'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <value>file</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <value>anonymous</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <value>memfd</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:   </memoryBacking>
Jan 21 18:05:01 compute-0 nova_compute[183278]:   <devices>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <disk supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='diskDevice'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>disk</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>cdrom</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>floppy</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>lun</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='bus'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>fdc</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>scsi</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>virtio</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>usb</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>sata</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='model'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>virtio</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>virtio-transitional</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>virtio-non-transitional</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </disk>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <graphics supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='type'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>vnc</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>egl-headless</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>dbus</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </graphics>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <video supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='modelType'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>vga</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>cirrus</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>virtio</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>none</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>bochs</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>ramfb</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </video>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <hostdev supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='mode'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>subsystem</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='startupPolicy'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>default</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>mandatory</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>requisite</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>optional</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='subsysType'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>usb</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>pci</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>scsi</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='capsType'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='pciBackend'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </hostdev>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <rng supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='model'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>virtio</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>virtio-transitional</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>virtio-non-transitional</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='backendModel'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>random</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>egd</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>builtin</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </rng>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <filesystem supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='driverType'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>path</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>handle</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>virtiofs</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </filesystem>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <tpm supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='model'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>tpm-tis</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>tpm-crb</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='backendModel'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>emulator</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>external</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='backendVersion'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>2.0</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </tpm>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <redirdev supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='bus'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>usb</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </redirdev>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <channel supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='type'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>pty</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>unix</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </channel>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <crypto supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='model'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='type'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>qemu</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='backendModel'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>builtin</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </crypto>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <interface supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='backendType'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>default</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>passt</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </interface>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <panic supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='model'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>isa</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>hyperv</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </panic>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <console supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='type'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>null</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>vc</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>pty</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>dev</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>file</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>pipe</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>stdio</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>udp</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>tcp</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>unix</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>qemu-vdagent</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>dbus</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </console>
Jan 21 18:05:01 compute-0 nova_compute[183278]:   </devices>
Jan 21 18:05:01 compute-0 nova_compute[183278]:   <features>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <gic supported='no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <vmcoreinfo supported='yes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <genid supported='yes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <backingStoreInput supported='yes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <backup supported='yes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <async-teardown supported='yes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <s390-pv supported='no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <ps2 supported='yes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <tdx supported='no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <sev supported='no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <sgx supported='no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <hyperv supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='features'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>relaxed</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>vapic</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>spinlocks</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>vpindex</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>runtime</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>synic</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>stimer</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>reset</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>vendor_id</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>frequencies</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>reenlightenment</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>tlbflush</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>ipi</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>avic</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>emsr_bitmap</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>xmm_input</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <defaults>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <spinlocks>4095</spinlocks>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <stimer_direct>on</stimer_direct>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <tlbflush_direct>on</tlbflush_direct>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <tlbflush_extended>on</tlbflush_extended>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </defaults>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </hyperv>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <launchSecurity supported='no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:   </features>
Jan 21 18:05:01 compute-0 nova_compute[183278]: </domainCapabilities>
Jan 21 18:05:01 compute-0 nova_compute[183278]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 21 18:05:01 compute-0 nova_compute[183278]: 2026-01-21 18:05:01.208 183284 DEBUG nova.virt.libvirt.host [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Jan 21 18:05:01 compute-0 nova_compute[183278]: <domainCapabilities>
Jan 21 18:05:01 compute-0 nova_compute[183278]:   <path>/usr/libexec/qemu-kvm</path>
Jan 21 18:05:01 compute-0 nova_compute[183278]:   <domain>kvm</domain>
Jan 21 18:05:01 compute-0 nova_compute[183278]:   <machine>pc-i440fx-rhel7.6.0</machine>
Jan 21 18:05:01 compute-0 nova_compute[183278]:   <arch>x86_64</arch>
Jan 21 18:05:01 compute-0 nova_compute[183278]:   <vcpu max='240'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:   <iothreads supported='yes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:   <os supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <enum name='firmware'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <loader supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='type'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>rom</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>pflash</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='readonly'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>yes</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>no</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='secure'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>no</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </loader>
Jan 21 18:05:01 compute-0 nova_compute[183278]:   </os>
Jan 21 18:05:01 compute-0 nova_compute[183278]:   <cpu>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <mode name='host-passthrough' supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='hostPassthroughMigratable'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>on</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>off</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </mode>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <mode name='maximum' supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='maximumMigratable'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>on</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>off</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </mode>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <mode name='host-model' supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <vendor>AMD</vendor>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='x2apic'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='tsc-deadline'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='hypervisor'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='tsc_adjust'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='spec-ctrl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='stibp'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='ssbd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='cmp_legacy'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='overflow-recov'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='succor'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='ibrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='amd-ssbd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='virt-ssbd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='lbrv'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='tsc-scale'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='vmcb-clean'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='flushbyasid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='pause-filter'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='pfthreshold'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='svme-addr-chk'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <feature policy='disable' name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </mode>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <mode name='custom' supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Broadwell'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Broadwell-IBRS'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Broadwell-noTSX'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Broadwell-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Broadwell-v2'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Broadwell-v3'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Broadwell-v4'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Cascadelake-Server'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Cascadelake-Server-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Cascadelake-Server-v2'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Cascadelake-Server-v3'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Cascadelake-Server-v4'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Cascadelake-Server-v5'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='ClearwaterForest'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-ne-convert'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni-int16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni-int8'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bhi-ctrl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bhi-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bus-lock-detect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='cldemote'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='cmpccxadd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ddpd-u'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fbsdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='intel-psfd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ipred-ctrl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='lam'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='mcdt-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdir64b'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdiri'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pbrsb-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='prefetchiti'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='psdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rrsba-ctrl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='serialize'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sha512'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sm3'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sm4'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ss'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='ClearwaterForest-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-ne-convert'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni-int16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni-int8'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bhi-ctrl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bhi-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bus-lock-detect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='cldemote'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='cmpccxadd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ddpd-u'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fbsdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='intel-psfd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ipred-ctrl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='lam'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='mcdt-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdir64b'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdiri'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pbrsb-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='prefetchiti'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='psdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rrsba-ctrl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='serialize'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sha512'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sm3'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sm4'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ss'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Cooperlake'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='taa-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Cooperlake-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='taa-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Cooperlake-v2'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='taa-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Denverton'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='mpx'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Denverton-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='mpx'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Denverton-v2'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Denverton-v3'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Dhyana-v2'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='EPYC-Genoa'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amd-psfd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='auto-ibrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='no-nested-data-bp'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='null-sel-clr-base'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='stibp-always-on'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='EPYC-Genoa-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amd-psfd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='auto-ibrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='no-nested-data-bp'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='null-sel-clr-base'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='stibp-always-on'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='EPYC-Genoa-v2'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amd-psfd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='auto-ibrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fs-gs-base-ns'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='no-nested-data-bp'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='null-sel-clr-base'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='perfmon-v2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='stibp-always-on'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='EPYC-Milan'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='EPYC-Milan-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='EPYC-Milan-v2'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amd-psfd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='no-nested-data-bp'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='null-sel-clr-base'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='stibp-always-on'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='EPYC-Milan-v3'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amd-psfd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='no-nested-data-bp'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='null-sel-clr-base'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='stibp-always-on'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='EPYC-Rome'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='EPYC-Rome-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='EPYC-Rome-v2'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='EPYC-Rome-v3'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='EPYC-Turin'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amd-psfd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='auto-ibrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vp2intersect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fs-gs-base-ns'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibpb-brtype'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdir64b'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdiri'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='no-nested-data-bp'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='null-sel-clr-base'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='perfmon-v2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='prefetchi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sbpb'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='srso-user-kernel-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='stibp-always-on'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='EPYC-Turin-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amd-psfd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='auto-ibrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vp2intersect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fs-gs-base-ns'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibpb-brtype'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdir64b'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdiri'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='no-nested-data-bp'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='null-sel-clr-base'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='perfmon-v2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='prefetchi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sbpb'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='srso-user-kernel-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='stibp-always-on'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='EPYC-v3'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='EPYC-v4'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='EPYC-v5'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='GraniteRapids'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-fp16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-int8'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-tile'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-fp16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bus-lock-detect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fbsdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrc'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fzrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='mcdt-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pbrsb-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='prefetchiti'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='psdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='serialize'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='taa-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='tsx-ldtrk'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xfd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='GraniteRapids-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-fp16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-int8'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-tile'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-fp16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bus-lock-detect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fbsdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrc'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fzrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='mcdt-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pbrsb-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='prefetchiti'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='psdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='serialize'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='taa-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='tsx-ldtrk'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xfd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='GraniteRapids-v2'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-fp16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-int8'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-tile'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx10'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx10-128'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx10-256'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx10-512'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-fp16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bus-lock-detect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='cldemote'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fbsdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrc'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fzrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='mcdt-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdir64b'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdiri'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pbrsb-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='prefetchiti'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='psdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='serialize'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ss'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='taa-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='tsx-ldtrk'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xfd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='GraniteRapids-v3'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-fp16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-int8'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-tile'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx10'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx10-128'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx10-256'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx10-512'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-fp16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bus-lock-detect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='cldemote'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fbsdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrc'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fzrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='mcdt-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdir64b'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdiri'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pbrsb-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='prefetchiti'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='psdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='serialize'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ss'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='taa-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='tsx-ldtrk'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xfd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Haswell'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Haswell-IBRS'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Haswell-noTSX'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Haswell-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Haswell-v2'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Haswell-v3'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Haswell-v4'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Icelake-Server'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Icelake-Server-noTSX'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Icelake-Server-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Icelake-Server-v2'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Icelake-Server-v3'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='taa-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Icelake-Server-v4'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='taa-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Icelake-Server-v5'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='taa-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Icelake-Server-v6'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='taa-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Icelake-Server-v7'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='taa-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='IvyBridge'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='IvyBridge-IBRS'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='IvyBridge-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='IvyBridge-v2'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='KnightsMill'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-4fmaps'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-4vnniw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512er'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512pf'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ss'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='KnightsMill-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-4fmaps'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-4vnniw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512er'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512pf'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ss'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Opteron_G4'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fma4'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xop'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Opteron_G4-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fma4'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xop'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Opteron_G5'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fma4'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='tbm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xop'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Opteron_G5-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fma4'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='tbm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xop'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='SapphireRapids'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-int8'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-tile'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-fp16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bus-lock-detect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrc'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fzrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='serialize'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='taa-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='tsx-ldtrk'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xfd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='SapphireRapids-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-int8'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-tile'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-fp16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bus-lock-detect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrc'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fzrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='serialize'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='taa-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='tsx-ldtrk'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xfd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='SapphireRapids-v2'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-int8'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-tile'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-fp16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bus-lock-detect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fbsdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrc'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fzrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='psdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='serialize'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='taa-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='tsx-ldtrk'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xfd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='SapphireRapids-v3'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-int8'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-tile'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-fp16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bus-lock-detect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='cldemote'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fbsdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrc'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fzrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdir64b'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdiri'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='psdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='serialize'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ss'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='taa-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='tsx-ldtrk'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xfd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='SapphireRapids-v4'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-int8'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='amx-tile'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-bf16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-fp16'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512-vpopcntdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bitalg'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vbmi2'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bus-lock-detect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='cldemote'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fbsdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrc'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fzrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='la57'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdir64b'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdiri'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='psdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='serialize'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ss'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='taa-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='tsx-ldtrk'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xfd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='SierraForest'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-ne-convert'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni-int8'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bus-lock-detect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='cmpccxadd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fbsdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='mcdt-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pbrsb-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='psdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='serialize'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='SierraForest-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-ne-convert'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni-int8'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bus-lock-detect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='cmpccxadd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fbsdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='mcdt-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pbrsb-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='psdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='serialize'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='SierraForest-v2'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-ne-convert'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni-int8'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bhi-ctrl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bus-lock-detect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='cldemote'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='cmpccxadd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fbsdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='intel-psfd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ipred-ctrl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='lam'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='mcdt-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdir64b'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdiri'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pbrsb-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='psdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rrsba-ctrl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='serialize'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ss'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='SierraForest-v3'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-ifma'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-ne-convert'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx-vnni-int8'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bhi-ctrl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='bus-lock-detect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='cldemote'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='cmpccxadd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fbsdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='fsrs'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ibrs-all'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='intel-psfd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ipred-ctrl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='lam'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='mcdt-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdir64b'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdiri'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pbrsb-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='psdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rrsba-ctrl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='sbdr-ssdp-no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='serialize'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ss'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vaes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='vpclmulqdq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Skylake-Client'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Skylake-Client-IBRS'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Skylake-Client-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Skylake-Client-v2'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Skylake-Client-v3'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Skylake-Client-v4'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Skylake-Server'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Skylake-Server-IBRS'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Skylake-Server-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Skylake-Server-v2'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='hle'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='rtm'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Skylake-Server-v3'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Skylake-Server-v4'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Skylake-Server-v5'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512bw'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512cd'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512dq'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512f'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='avx512vl'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='invpcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pcid'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='pku'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Snowridge'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='cldemote'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='core-capability'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdir64b'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdiri'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='mpx'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='split-lock-detect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Snowridge-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='cldemote'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='core-capability'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdir64b'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdiri'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='mpx'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='split-lock-detect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Snowridge-v2'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='cldemote'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='core-capability'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdir64b'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdiri'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='split-lock-detect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Snowridge-v3'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='cldemote'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='core-capability'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdir64b'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdiri'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='split-lock-detect'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='Snowridge-v4'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='cldemote'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='erms'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='gfni'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdir64b'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='movdiri'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='xsaves'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='athlon'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='3dnow'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='3dnowext'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='athlon-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='3dnow'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='3dnowext'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='core2duo'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ss'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='core2duo-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ss'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='coreduo'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ss'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='coreduo-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ss'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='n270'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ss'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='n270-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='ss'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='phenom'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='3dnow'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='3dnowext'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <blockers model='phenom-v1'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='3dnow'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <feature name='3dnowext'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </blockers>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </mode>
Jan 21 18:05:01 compute-0 nova_compute[183278]:   </cpu>
Jan 21 18:05:01 compute-0 nova_compute[183278]:   <memoryBacking supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <enum name='sourceType'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <value>file</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <value>anonymous</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <value>memfd</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:   </memoryBacking>
Jan 21 18:05:01 compute-0 nova_compute[183278]:   <devices>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <disk supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='diskDevice'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>disk</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>cdrom</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>floppy</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>lun</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='bus'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>ide</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>fdc</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>scsi</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>virtio</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>usb</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>sata</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='model'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>virtio</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>virtio-transitional</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>virtio-non-transitional</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </disk>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <graphics supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='type'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>vnc</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>egl-headless</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>dbus</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </graphics>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <video supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='modelType'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>vga</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>cirrus</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>virtio</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>none</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>bochs</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>ramfb</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </video>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <hostdev supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='mode'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>subsystem</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='startupPolicy'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>default</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>mandatory</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>requisite</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>optional</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='subsysType'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>usb</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>pci</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>scsi</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='capsType'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='pciBackend'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </hostdev>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <rng supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='model'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>virtio</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>virtio-transitional</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>virtio-non-transitional</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='backendModel'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>random</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>egd</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>builtin</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </rng>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <filesystem supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='driverType'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>path</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>handle</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>virtiofs</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </filesystem>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <tpm supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='model'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>tpm-tis</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>tpm-crb</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='backendModel'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>emulator</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>external</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='backendVersion'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>2.0</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </tpm>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <redirdev supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='bus'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>usb</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </redirdev>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <channel supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='type'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>pty</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>unix</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </channel>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <crypto supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='model'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='type'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>qemu</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='backendModel'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>builtin</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </crypto>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <interface supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='backendType'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>default</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>passt</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </interface>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <panic supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='model'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>isa</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>hyperv</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </panic>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <console supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='type'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>null</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>vc</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>pty</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>dev</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>file</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>pipe</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>stdio</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>udp</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>tcp</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>unix</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>qemu-vdagent</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>dbus</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </console>
Jan 21 18:05:01 compute-0 nova_compute[183278]:   </devices>
Jan 21 18:05:01 compute-0 nova_compute[183278]:   <features>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <gic supported='no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <vmcoreinfo supported='yes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <genid supported='yes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <backingStoreInput supported='yes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <backup supported='yes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <async-teardown supported='yes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <s390-pv supported='no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <ps2 supported='yes'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <tdx supported='no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <sev supported='no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <sgx supported='no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <hyperv supported='yes'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <enum name='features'>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>relaxed</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>vapic</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>spinlocks</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>vpindex</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>runtime</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>synic</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>stimer</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>reset</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>vendor_id</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>frequencies</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>reenlightenment</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>tlbflush</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>ipi</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>avic</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>emsr_bitmap</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <value>xmm_input</value>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </enum>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       <defaults>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <spinlocks>4095</spinlocks>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <stimer_direct>on</stimer_direct>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <tlbflush_direct>on</tlbflush_direct>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <tlbflush_extended>on</tlbflush_extended>
Jan 21 18:05:01 compute-0 nova_compute[183278]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 21 18:05:01 compute-0 nova_compute[183278]:       </defaults>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     </hyperv>
Jan 21 18:05:01 compute-0 nova_compute[183278]:     <launchSecurity supported='no'/>
Jan 21 18:05:01 compute-0 nova_compute[183278]:   </features>
Jan 21 18:05:01 compute-0 nova_compute[183278]: </domainCapabilities>
Jan 21 18:05:01 compute-0 nova_compute[183278]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 21 18:05:01 compute-0 nova_compute[183278]: 2026-01-21 18:05:01.277 183284 DEBUG nova.virt.libvirt.host [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Jan 21 18:05:01 compute-0 nova_compute[183278]: 2026-01-21 18:05:01.278 183284 INFO nova.virt.libvirt.host [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Secure Boot support detected
Jan 21 18:05:01 compute-0 nova_compute[183278]: 2026-01-21 18:05:01.280 183284 INFO nova.virt.libvirt.driver [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Jan 21 18:05:01 compute-0 nova_compute[183278]: 2026-01-21 18:05:01.280 183284 INFO nova.virt.libvirt.driver [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Jan 21 18:05:01 compute-0 nova_compute[183278]: 2026-01-21 18:05:01.290 183284 DEBUG nova.virt.libvirt.driver [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] cpu compare xml: <cpu match="exact">
Jan 21 18:05:01 compute-0 nova_compute[183278]:   <model>Nehalem</model>
Jan 21 18:05:01 compute-0 nova_compute[183278]: </cpu>
Jan 21 18:05:01 compute-0 nova_compute[183278]:  _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019
Jan 21 18:05:01 compute-0 nova_compute[183278]: 2026-01-21 18:05:01.293 183284 DEBUG nova.virt.libvirt.driver [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Jan 21 18:05:01 compute-0 nova_compute[183278]: 2026-01-21 18:05:01.315 183284 INFO nova.virt.node [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Determined node identity 502e4243-611b-433d-a766-9b485d51652d from /var/lib/nova/compute_id
Jan 21 18:05:01 compute-0 nova_compute[183278]: 2026-01-21 18:05:01.333 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Verified node 502e4243-611b-433d-a766-9b485d51652d matches my host compute-0.ctlplane.example.com _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568
Jan 21 18:05:01 compute-0 nova_compute[183278]: 2026-01-21 18:05:01.380 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Jan 21 18:05:01 compute-0 nova_compute[183278]: 2026-01-21 18:05:01.863 183284 ERROR nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Could not retrieve compute node resource provider 502e4243-611b-433d-a766-9b485d51652d and therefore unable to error out any instances stuck in BUILDING state. Error: Failed to retrieve allocations for resource provider 502e4243-611b-433d-a766-9b485d51652d: {"errors": [{"status": 404, "title": "Not Found", "detail": "The resource could not be found.\n\n Resource provider '502e4243-611b-433d-a766-9b485d51652d' not found: No resource provider with uuid 502e4243-611b-433d-a766-9b485d51652d found  ", "request_id": "req-0704985f-3999-44e6-9f54-d3d08048f133"}]}: nova.exception.ResourceProviderAllocationRetrievalFailed: Failed to retrieve allocations for resource provider 502e4243-611b-433d-a766-9b485d51652d: {"errors": [{"status": 404, "title": "Not Found", "detail": "The resource could not be found.\n\n Resource provider '502e4243-611b-433d-a766-9b485d51652d' not found: No resource provider with uuid 502e4243-611b-433d-a766-9b485d51652d found  ", "request_id": "req-0704985f-3999-44e6-9f54-d3d08048f133"}]}
Jan 21 18:05:01 compute-0 nova_compute[183278]: 2026-01-21 18:05:01.887 183284 DEBUG oslo_concurrency.lockutils [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:05:01 compute-0 nova_compute[183278]: 2026-01-21 18:05:01.887 183284 DEBUG oslo_concurrency.lockutils [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:05:01 compute-0 nova_compute[183278]: 2026-01-21 18:05:01.887 183284 DEBUG oslo_concurrency.lockutils [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:05:01 compute-0 nova_compute[183278]: 2026-01-21 18:05:01.887 183284 DEBUG nova.compute.resource_tracker [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 18:05:02 compute-0 nova_compute[183278]: 2026-01-21 18:05:02.034 183284 WARNING nova.virt.libvirt.driver [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 18:05:02 compute-0 nova_compute[183278]: 2026-01-21 18:05:02.035 183284 DEBUG nova.compute.resource_tracker [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6179MB free_disk=73.58218765258789GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 18:05:02 compute-0 nova_compute[183278]: 2026-01-21 18:05:02.035 183284 DEBUG oslo_concurrency.lockutils [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:05:02 compute-0 nova_compute[183278]: 2026-01-21 18:05:02.036 183284 DEBUG oslo_concurrency.lockutils [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:05:02 compute-0 nova_compute[183278]: 2026-01-21 18:05:02.146 183284 ERROR nova.compute.resource_tracker [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Skipping removal of allocations for deleted instances: Failed to retrieve allocations for resource provider 502e4243-611b-433d-a766-9b485d51652d: {"errors": [{"status": 404, "title": "Not Found", "detail": "The resource could not be found.\n\n Resource provider '502e4243-611b-433d-a766-9b485d51652d' not found: No resource provider with uuid 502e4243-611b-433d-a766-9b485d51652d found  ", "request_id": "req-591eb120-8138-4794-8d40-13ade6d8c334"}]}: nova.exception.ResourceProviderAllocationRetrievalFailed: Failed to retrieve allocations for resource provider 502e4243-611b-433d-a766-9b485d51652d: {"errors": [{"status": 404, "title": "Not Found", "detail": "The resource could not be found.\n\n Resource provider '502e4243-611b-433d-a766-9b485d51652d' not found: No resource provider with uuid 502e4243-611b-433d-a766-9b485d51652d found  ", "request_id": "req-591eb120-8138-4794-8d40-13ade6d8c334"}]}
Jan 21 18:05:02 compute-0 nova_compute[183278]: 2026-01-21 18:05:02.147 183284 DEBUG nova.compute.resource_tracker [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 18:05:02 compute-0 nova_compute[183278]: 2026-01-21 18:05:02.147 183284 DEBUG nova.compute.resource_tracker [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 18:05:02 compute-0 nova_compute[183278]: 2026-01-21 18:05:02.271 183284 INFO nova.scheduler.client.report [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [req-cb0292f9-907b-41b1-ad2e-dd382d49b8a1] Created resource provider record via placement API for resource provider with UUID 502e4243-611b-433d-a766-9b485d51652d and name compute-0.ctlplane.example.com.
Jan 21 18:05:02 compute-0 nova_compute[183278]: 2026-01-21 18:05:02.306 183284 DEBUG nova.virt.libvirt.host [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Jan 21 18:05:02 compute-0 nova_compute[183278]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Jan 21 18:05:02 compute-0 nova_compute[183278]: 2026-01-21 18:05:02.306 183284 INFO nova.virt.libvirt.host [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] kernel doesn't support AMD SEV
Jan 21 18:05:02 compute-0 nova_compute[183278]: 2026-01-21 18:05:02.308 183284 DEBUG nova.compute.provider_tree [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Updating inventory in ProviderTree for provider 502e4243-611b-433d-a766-9b485d51652d with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 21 18:05:02 compute-0 nova_compute[183278]: 2026-01-21 18:05:02.309 183284 DEBUG nova.virt.libvirt.driver [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 18:05:02 compute-0 nova_compute[183278]: 2026-01-21 18:05:02.313 183284 DEBUG nova.virt.libvirt.driver [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Libvirt baseline CPU <cpu>
Jan 21 18:05:02 compute-0 nova_compute[183278]:   <arch>x86_64</arch>
Jan 21 18:05:02 compute-0 nova_compute[183278]:   <model>Nehalem</model>
Jan 21 18:05:02 compute-0 nova_compute[183278]:   <vendor>AMD</vendor>
Jan 21 18:05:02 compute-0 nova_compute[183278]:   <topology sockets="8" cores="1" threads="1"/>
Jan 21 18:05:02 compute-0 nova_compute[183278]: </cpu>
Jan 21 18:05:02 compute-0 nova_compute[183278]:  _get_guest_baseline_cpu_features /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12537
Jan 21 18:05:02 compute-0 nova_compute[183278]: 2026-01-21 18:05:02.449 183284 DEBUG nova.scheduler.client.report [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Updated inventory for provider 502e4243-611b-433d-a766-9b485d51652d with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Jan 21 18:05:02 compute-0 nova_compute[183278]: 2026-01-21 18:05:02.449 183284 DEBUG nova.compute.provider_tree [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Updating resource provider 502e4243-611b-433d-a766-9b485d51652d generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Jan 21 18:05:02 compute-0 nova_compute[183278]: 2026-01-21 18:05:02.450 183284 DEBUG nova.compute.provider_tree [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Updating inventory in ProviderTree for provider 502e4243-611b-433d-a766-9b485d51652d with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 21 18:05:02 compute-0 nova_compute[183278]: 2026-01-21 18:05:02.578 183284 DEBUG nova.compute.provider_tree [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Updating resource provider 502e4243-611b-433d-a766-9b485d51652d generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Jan 21 18:05:02 compute-0 nova_compute[183278]: 2026-01-21 18:05:02.603 183284 DEBUG nova.compute.resource_tracker [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 18:05:02 compute-0 nova_compute[183278]: 2026-01-21 18:05:02.603 183284 DEBUG oslo_concurrency.lockutils [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.568s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:05:02 compute-0 nova_compute[183278]: 2026-01-21 18:05:02.604 183284 DEBUG nova.service [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Jan 21 18:05:02 compute-0 nova_compute[183278]: 2026-01-21 18:05:02.704 183284 DEBUG nova.service [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Jan 21 18:05:02 compute-0 nova_compute[183278]: 2026-01-21 18:05:02.704 183284 DEBUG nova.servicegroup.drivers.db [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] DB_Driver: join new ServiceGroup member compute-0.ctlplane.example.com to the compute group, service = <Service: host=compute-0.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Jan 21 18:05:05 compute-0 sshd-session[183578]: Accepted publickey for zuul from 192.168.122.30 port 45112 ssh2: ECDSA SHA256:cKen23vhgWniT1Xii+Y+iYDmAKCwydOnjbSSWnKmVRE
Jan 21 18:05:05 compute-0 systemd-logind[782]: New session 27 of user zuul.
Jan 21 18:05:05 compute-0 systemd[1]: Started Session 27 of User zuul.
Jan 21 18:05:05 compute-0 sshd-session[183578]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 21 18:05:06 compute-0 python3.9[183731]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 18:05:07 compute-0 sudo[183885]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-usqbwohyfgpblhfqybyncgnmmixbktva ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018707.0771751-47-22006876289173/AnsiballZ_systemd_service.py'
Jan 21 18:05:07 compute-0 sudo[183885]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:05:08 compute-0 python3.9[183887]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 21 18:05:08 compute-0 systemd[1]: Reloading.
Jan 21 18:05:08 compute-0 systemd-sysv-generator[183917]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:05:08 compute-0 systemd-rc-local-generator[183914]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:05:08 compute-0 sudo[183885]: pam_unix(sudo:session): session closed for user root
Jan 21 18:05:09 compute-0 python3.9[184072]: ansible-ansible.builtin.service_facts Invoked
Jan 21 18:05:09 compute-0 network[184089]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 21 18:05:09 compute-0 network[184090]: 'network-scripts' will be removed from distribution in near future.
Jan 21 18:05:09 compute-0 network[184091]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 21 18:05:13 compute-0 sudo[184361]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wuwgtuejylnedjtotjwpjqccdftvwgmi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018713.4312286-85-209092927878471/AnsiballZ_systemd_service.py'
Jan 21 18:05:13 compute-0 sudo[184361]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:05:13 compute-0 python3.9[184363]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 18:05:14 compute-0 sudo[184361]: pam_unix(sudo:session): session closed for user root
Jan 21 18:05:14 compute-0 sudo[184514]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mszrhrukjdezrkgwksrxpgortevmavnb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018714.3581731-105-23885033460833/AnsiballZ_file.py'
Jan 21 18:05:14 compute-0 sudo[184514]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:05:14 compute-0 python3.9[184516]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:05:14 compute-0 sudo[184514]: pam_unix(sudo:session): session closed for user root
Jan 21 18:05:14 compute-0 rsyslogd[1002]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 21 18:05:14 compute-0 rsyslogd[1002]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 21 18:05:15 compute-0 sudo[184667]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kjjpcajezgyngvwxteizqaopzmsfrnag ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018715.2200658-121-6441494690118/AnsiballZ_file.py'
Jan 21 18:05:15 compute-0 sudo[184667]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:05:15 compute-0 python3.9[184669]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:05:15 compute-0 sudo[184667]: pam_unix(sudo:session): session closed for user root
Jan 21 18:05:16 compute-0 sudo[184819]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fxmlkjmdxutfkfdzbjdlzywzzgsqkekt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018715.9614346-139-29889939775248/AnsiballZ_command.py'
Jan 21 18:05:16 compute-0 sudo[184819]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:05:16 compute-0 python3.9[184821]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:05:16 compute-0 sudo[184819]: pam_unix(sudo:session): session closed for user root
Jan 21 18:05:17 compute-0 python3.9[184973]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 21 18:05:18 compute-0 sudo[185123]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zvjurlnrpcpdzlsfcmnzsjhcqbhczgao ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018717.8660028-175-153498132287782/AnsiballZ_systemd_service.py'
Jan 21 18:05:18 compute-0 sudo[185123]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:05:18 compute-0 python3.9[185125]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 21 18:05:18 compute-0 systemd[1]: Reloading.
Jan 21 18:05:18 compute-0 systemd-sysv-generator[185150]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:05:18 compute-0 systemd-rc-local-generator[185147]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:05:18 compute-0 sudo[185123]: pam_unix(sudo:session): session closed for user root
Jan 21 18:05:19 compute-0 sudo[185310]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-grzqlvwrytktquwreltdwovcdabioxse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018718.9296012-191-275428828558895/AnsiballZ_command.py'
Jan 21 18:05:19 compute-0 sudo[185310]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:05:19 compute-0 python3.9[185312]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:05:19 compute-0 sudo[185310]: pam_unix(sudo:session): session closed for user root
Jan 21 18:05:19 compute-0 sudo[185463]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-srerkdccbsmpspwomfdrzvqphhgxybtk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018719.7589104-209-185134069770242/AnsiballZ_file.py'
Jan 21 18:05:19 compute-0 sudo[185463]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:05:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:05:20.059 104698 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:05:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:05:20.061 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:05:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:05:20.061 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:05:20 compute-0 python3.9[185465]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:05:20 compute-0 sudo[185463]: pam_unix(sudo:session): session closed for user root
Jan 21 18:05:21 compute-0 python3.9[185615]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 18:05:21 compute-0 sudo[185767]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fjenksgpxyobtmmivxecqzfbyrmgjsbu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018721.3389242-241-90757137312354/AnsiballZ_group.py'
Jan 21 18:05:21 compute-0 sudo[185767]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:05:21 compute-0 python3.9[185769]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None
Jan 21 18:05:21 compute-0 sudo[185767]: pam_unix(sudo:session): session closed for user root
Jan 21 18:05:22 compute-0 sudo[185919]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zeybqmwnojoyeborliurkcutpptascmk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018722.328545-263-130395305561084/AnsiballZ_getent.py'
Jan 21 18:05:22 compute-0 sudo[185919]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:05:22 compute-0 python3.9[185921]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None
Jan 21 18:05:22 compute-0 sudo[185919]: pam_unix(sudo:session): session closed for user root
Jan 21 18:05:23 compute-0 sudo[186072]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmdmmfqjnykscevivebrngnmlfydgbfi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018723.2433245-279-242249564684347/AnsiballZ_group.py'
Jan 21 18:05:23 compute-0 sudo[186072]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:05:23 compute-0 python3.9[186074]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 21 18:05:23 compute-0 groupadd[186075]: group added to /etc/group: name=ceilometer, GID=42405
Jan 21 18:05:23 compute-0 groupadd[186075]: group added to /etc/gshadow: name=ceilometer
Jan 21 18:05:23 compute-0 groupadd[186075]: new group: name=ceilometer, GID=42405
Jan 21 18:05:23 compute-0 sudo[186072]: pam_unix(sudo:session): session closed for user root
Jan 21 18:05:24 compute-0 sudo[186230]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vietkytqlakzoeljwlxczitcbtaurvpj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018724.198714-295-84149577524434/AnsiballZ_user.py'
Jan 21 18:05:24 compute-0 sudo[186230]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:05:24 compute-0 python3.9[186232]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 21 18:05:24 compute-0 useradd[186234]: new user: name=ceilometer, UID=42405, GID=42405, home=/home/ceilometer, shell=/sbin/nologin, from=/dev/pts/0
Jan 21 18:05:24 compute-0 useradd[186234]: add 'ceilometer' to group 'libvirt'
Jan 21 18:05:24 compute-0 useradd[186234]: add 'ceilometer' to shadow group 'libvirt'
Jan 21 18:05:25 compute-0 sudo[186230]: pam_unix(sudo:session): session closed for user root
Jan 21 18:05:25 compute-0 podman[186322]: 2026-01-21 18:05:25.998871724 +0000 UTC m=+0.051530589 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 21 18:05:26 compute-0 podman[186317]: 2026-01-21 18:05:26.053527768 +0000 UTC m=+0.107626158 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 21 18:05:26 compute-0 python3.9[186433]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:05:26 compute-0 python3.9[186554]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1769018725.8459175-347-114325585523202/.source.conf _original_basename=ceilometer.conf follow=False checksum=f74f01c63e6cdeca5458ef9aff2a1db5d6a4e4b9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:05:27 compute-0 python3.9[186704]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:05:27 compute-0 python3.9[186825]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1769018727.1225073-347-160649920528722/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:05:28 compute-0 python3.9[186975]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:05:28 compute-0 nova_compute[183278]: 2026-01-21 18:05:28.706 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:05:28 compute-0 nova_compute[183278]: 2026-01-21 18:05:28.740 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:05:29 compute-0 python3.9[187096]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1769018728.1300914-347-199779415035712/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:05:29 compute-0 python3.9[187246]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 18:05:30 compute-0 python3.9[187398]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 18:05:31 compute-0 python3.9[187550]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:05:31 compute-0 python3.9[187671]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769018730.7605584-465-267544285541962/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=e86e0e43000ce9ccfe5aefbf8e8f2e3d15d05584 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:05:32 compute-0 python3.9[187821]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:05:32 compute-0 python3.9[187942]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/openstack_network_exporter.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769018731.8669446-465-186178753640600/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=87dede51a10e22722618c1900db75cb764463d91 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:05:33 compute-0 python3.9[188092]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:05:34 compute-0 python3.9[188213]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/firewall.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769018733.3262656-523-84533413279490/.source.yaml _original_basename=firewall.yaml follow=False checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:05:35 compute-0 python3.9[188363]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:05:35 compute-0 python3.9[188484]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/node_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1769018734.6277413-555-202557479543526/.source.yaml _original_basename=node_exporter.yaml follow=False checksum=81d906d3e1e8c4f8367276f5d3a67b80ca7e989e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:05:36 compute-0 python3.9[188634]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:05:36 compute-0 python3.9[188755]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/podman_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1769018735.746249-585-72495241097166/.source.yaml _original_basename=podman_exporter.yaml follow=False checksum=7ccb5eca2ff1dc337c3f3ecbbff5245af7149c47 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:05:37 compute-0 python3.9[188905]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:05:37 compute-0 python3.9[189026]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1769018736.9723082-615-242280593307253/.source.yaml _original_basename=ceilometer_prom_exporter.yaml follow=False checksum=10157c879411ee6023e506dc85a343cedc52700f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:05:38 compute-0 sudo[189176]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mlepdhhuddvsdmqctrpruqxsgnrgqojn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018738.167222-645-57124703352231/AnsiballZ_file.py'
Jan 21 18:05:38 compute-0 sudo[189176]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:05:38 compute-0 python3.9[189178]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.crt recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:05:38 compute-0 sudo[189176]: pam_unix(sudo:session): session closed for user root
Jan 21 18:05:39 compute-0 sudo[189328]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dbbwhvidudlsuiygwycjqixuzvwrhjry ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018738.8705473-661-145739722868699/AnsiballZ_file.py'
Jan 21 18:05:39 compute-0 sudo[189328]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:05:39 compute-0 python3.9[189330]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.key recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:05:39 compute-0 sudo[189328]: pam_unix(sudo:session): session closed for user root
Jan 21 18:05:39 compute-0 python3.9[189480]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 18:05:40 compute-0 python3.9[189632]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 18:05:41 compute-0 python3.9[189784]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 18:05:41 compute-0 sudo[189936]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-faofcqpndgvohkuhgsridfhowptavujs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018741.7356875-725-89676211279955/AnsiballZ_file.py'
Jan 21 18:05:41 compute-0 sudo[189936]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:05:42 compute-0 python3.9[189938]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:05:42 compute-0 sudo[189936]: pam_unix(sudo:session): session closed for user root
Jan 21 18:05:42 compute-0 sudo[190088]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zdlzssmdnlaxdycxzdabnkzqwnlhonys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018742.3599973-741-70297553334002/AnsiballZ_systemd_service.py'
Jan 21 18:05:42 compute-0 sudo[190088]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:05:42 compute-0 python3.9[190090]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 18:05:42 compute-0 systemd[1]: Reloading.
Jan 21 18:05:43 compute-0 systemd-rc-local-generator[190119]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:05:43 compute-0 systemd-sysv-generator[190123]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:05:43 compute-0 systemd[1]: Listening on Podman API Socket.
Jan 21 18:05:43 compute-0 sudo[190088]: pam_unix(sudo:session): session closed for user root
Jan 21 18:05:43 compute-0 sudo[190278]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ylypnjnotnniyfodgtytfxehpkvtrmvt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018743.6628268-759-90617430840063/AnsiballZ_stat.py'
Jan 21 18:05:43 compute-0 sudo[190278]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:05:44 compute-0 python3.9[190280]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/podman_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:05:44 compute-0 sudo[190278]: pam_unix(sudo:session): session closed for user root
Jan 21 18:05:44 compute-0 sudo[190401]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ohknvtrhujezjkirlcbuctkcvdzgajgi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018743.6628268-759-90617430840063/AnsiballZ_copy.py'
Jan 21 18:05:44 compute-0 sudo[190401]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:05:44 compute-0 python3.9[190403]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/podman_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769018743.6628268-759-90617430840063/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:05:44 compute-0 sudo[190401]: pam_unix(sudo:session): session closed for user root
Jan 21 18:05:45 compute-0 sudo[190553]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lqdsklkhqalsskpgynhnfzgvdhjgyxfp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018745.3810825-801-267112129503097/AnsiballZ_file.py'
Jan 21 18:05:45 compute-0 sudo[190553]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:05:45 compute-0 python3.9[190555]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:05:45 compute-0 sudo[190553]: pam_unix(sudo:session): session closed for user root
Jan 21 18:05:46 compute-0 sudo[190705]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-shqbsgklxnoxwgvqfaznujjgkrgclmvl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018746.2356875-817-89925323917879/AnsiballZ_file.py'
Jan 21 18:05:46 compute-0 sudo[190705]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:05:46 compute-0 python3.9[190707]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:05:46 compute-0 sudo[190705]: pam_unix(sudo:session): session closed for user root
Jan 21 18:05:47 compute-0 python3.9[190857]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/podman_exporter state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:05:49 compute-0 sudo[191278]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdiymjmqjlcjsfxowrhnovwwzwjfgeaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018749.1213412-885-251735449772433/AnsiballZ_container_config_data.py'
Jan 21 18:05:49 compute-0 sudo[191278]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:05:49 compute-0 python3.9[191280]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/podman_exporter config_pattern=*.json debug=False
Jan 21 18:05:49 compute-0 sudo[191278]: pam_unix(sudo:session): session closed for user root
Jan 21 18:05:50 compute-0 sudo[191430]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odbtoktgeruobsnumbvhekjcgotpeqwb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018750.2895975-907-45788318144441/AnsiballZ_container_config_hash.py'
Jan 21 18:05:50 compute-0 sudo[191430]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:05:50 compute-0 python3.9[191432]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 21 18:05:50 compute-0 sudo[191430]: pam_unix(sudo:session): session closed for user root
Jan 21 18:05:51 compute-0 sudo[191582]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-elcovausqhfccpkwpxshuqyhnjolkzfv ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769018751.418798-927-60103662178044/AnsiballZ_edpm_container_manage.py'
Jan 21 18:05:51 compute-0 sudo[191582]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:05:52 compute-0 python3[191584]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/podman_exporter config_id=podman_exporter config_overrides={} config_patterns=*.json containers=['podman_exporter'] log_base_path=/var/log/containers/stdouts debug=False
Jan 21 18:05:53 compute-0 podman[191597]: 2026-01-21 18:05:53.918828578 +0000 UTC m=+1.574015794 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Jan 21 18:05:54 compute-0 podman[191697]: 2026-01-21 18:05:54.096664763 +0000 UTC m=+0.065857694 container create 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 21 18:05:54 compute-0 podman[191697]: 2026-01-21 18:05:54.062639905 +0000 UTC m=+0.031832926 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Jan 21 18:05:54 compute-0 python3[191584]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name podman_exporter --conmon-pidfile /run/podman_exporter.pid --env CONTAINER_HOST=unix:///run/podman/podman.sock --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6 --healthcheck-command /openstack/healthcheck podman_exporter --label config_id=podman_exporter --label container_name=podman_exporter --label managed_by=edpm_ansible --label config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9882:9882 --user root --volume /var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z --volume /run/podman/podman.sock:/run/podman/podman.sock:rw,z --volume /var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z quay.io/navidys/prometheus-podman-exporter:v1.10.1 --web.config.file=/etc/podman_exporter/podman_exporter.yaml
Jan 21 18:05:54 compute-0 sudo[191582]: pam_unix(sudo:session): session closed for user root
Jan 21 18:05:54 compute-0 sudo[191882]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vrbciqjlejgerpdegbvzrdrdbbctlkak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018754.4219275-943-62839086354877/AnsiballZ_stat.py'
Jan 21 18:05:54 compute-0 sudo[191882]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:05:54 compute-0 python3.9[191884]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 18:05:54 compute-0 sudo[191882]: pam_unix(sudo:session): session closed for user root
Jan 21 18:05:55 compute-0 sudo[192036]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixjsqhkrzsmepgvquglizaclygnzxzco ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018755.4816751-961-184874875616596/AnsiballZ_file.py'
Jan 21 18:05:55 compute-0 sudo[192036]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:05:55 compute-0 python3.9[192038]: ansible-file Invoked with path=/etc/systemd/system/edpm_podman_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:05:55 compute-0 sudo[192036]: pam_unix(sudo:session): session closed for user root
Jan 21 18:05:56 compute-0 sudo[192131]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yxhstrojznbnyrfgjpknrwgcgehcjdgs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018755.4816751-961-184874875616596/AnsiballZ_stat.py'
Jan 21 18:05:56 compute-0 sudo[192131]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:05:56 compute-0 podman[192087]: 2026-01-21 18:05:56.250200206 +0000 UTC m=+0.075000133 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 18:05:56 compute-0 podman[192086]: 2026-01-21 18:05:56.295496035 +0000 UTC m=+0.116614794 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 21 18:05:56 compute-0 python3.9[192142]: ansible-stat Invoked with path=/etc/systemd/system/edpm_podman_exporter_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 18:05:56 compute-0 sudo[192131]: pam_unix(sudo:session): session closed for user root
Jan 21 18:05:57 compute-0 sudo[192303]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-chydznzihoffjwgmwsiokwhafuwuorew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018756.5131042-961-274067940172063/AnsiballZ_copy.py'
Jan 21 18:05:57 compute-0 sudo[192303]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:05:57 compute-0 python3.9[192305]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769018756.5131042-961-274067940172063/source dest=/etc/systemd/system/edpm_podman_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:05:57 compute-0 sudo[192303]: pam_unix(sudo:session): session closed for user root
Jan 21 18:05:57 compute-0 sudo[192379]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hacsylancczftzpemkujdtxskbnbqjbl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018756.5131042-961-274067940172063/AnsiballZ_systemd.py'
Jan 21 18:05:57 compute-0 sudo[192379]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:05:58 compute-0 python3.9[192381]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 21 18:05:58 compute-0 systemd[1]: Reloading.
Jan 21 18:05:58 compute-0 systemd-rc-local-generator[192406]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:05:58 compute-0 systemd-sysv-generator[192409]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:05:58 compute-0 sudo[192379]: pam_unix(sudo:session): session closed for user root
Jan 21 18:05:58 compute-0 sudo[192490]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djisdgmhxroryueuquppdsltvbiicrrf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018756.5131042-961-274067940172063/AnsiballZ_systemd.py'
Jan 21 18:05:58 compute-0 sudo[192490]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:05:59 compute-0 python3.9[192492]: ansible-systemd Invoked with state=restarted name=edpm_podman_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 18:05:59 compute-0 systemd[1]: Reloading.
Jan 21 18:05:59 compute-0 systemd-sysv-generator[192525]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:05:59 compute-0 systemd-rc-local-generator[192522]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:05:59 compute-0 systemd[1]: Starting podman_exporter container...
Jan 21 18:05:59 compute-0 systemd[1]: Started libcrun container.
Jan 21 18:05:59 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/87970768d83ab0e613cf12560f033b84289bc987256bfff681f5a5ae275a6bfd/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Jan 21 18:05:59 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/87970768d83ab0e613cf12560f033b84289bc987256bfff681f5a5ae275a6bfd/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Jan 21 18:05:59 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb.
Jan 21 18:05:59 compute-0 podman[192532]: 2026-01-21 18:05:59.522798888 +0000 UTC m=+0.124026472 container init 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 18:05:59 compute-0 podman_exporter[192548]: ts=2026-01-21T18:05:59.537Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Jan 21 18:05:59 compute-0 podman_exporter[192548]: ts=2026-01-21T18:05:59.537Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Jan 21 18:05:59 compute-0 podman_exporter[192548]: ts=2026-01-21T18:05:59.537Z caller=handler.go:94 level=info msg="enabled collectors"
Jan 21 18:05:59 compute-0 podman_exporter[192548]: ts=2026-01-21T18:05:59.537Z caller=handler.go:105 level=info collector=container
Jan 21 18:05:59 compute-0 podman[192532]: 2026-01-21 18:05:59.546797985 +0000 UTC m=+0.148025559 container start 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 18:05:59 compute-0 podman[192532]: podman_exporter
Jan 21 18:05:59 compute-0 systemd[1]: Starting Podman API Service...
Jan 21 18:05:59 compute-0 systemd[1]: Started Podman API Service.
Jan 21 18:05:59 compute-0 systemd[1]: Started podman_exporter container.
Jan 21 18:05:59 compute-0 podman[192560]: time="2026-01-21T18:05:59Z" level=info msg="/usr/bin/podman filtering at log level info"
Jan 21 18:05:59 compute-0 podman[192560]: time="2026-01-21T18:05:59Z" level=info msg="Setting parallel job count to 25"
Jan 21 18:05:59 compute-0 podman[192560]: time="2026-01-21T18:05:59Z" level=info msg="Using sqlite as database backend"
Jan 21 18:05:59 compute-0 sudo[192490]: pam_unix(sudo:session): session closed for user root
Jan 21 18:05:59 compute-0 podman[192560]: time="2026-01-21T18:05:59Z" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
Jan 21 18:05:59 compute-0 podman[192560]: time="2026-01-21T18:05:59Z" level=info msg="Using systemd socket activation to determine API endpoint"
Jan 21 18:05:59 compute-0 podman[192560]: time="2026-01-21T18:05:59Z" level=info msg="API service listening on \"/run/podman/podman.sock\". URI: \"unix:///run/podman/podman.sock\""
Jan 21 18:05:59 compute-0 podman[192560]: @ - - [21/Jan/2026:18:05:59 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Jan 21 18:05:59 compute-0 podman[192560]: time="2026-01-21T18:05:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 18:05:59 compute-0 podman[192560]: @ - - [21/Jan/2026:18:05:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 12121 "" "Go-http-client/1.1"
Jan 21 18:05:59 compute-0 podman_exporter[192548]: ts=2026-01-21T18:05:59.610Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Jan 21 18:05:59 compute-0 podman_exporter[192548]: ts=2026-01-21T18:05:59.611Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Jan 21 18:05:59 compute-0 podman_exporter[192548]: ts=2026-01-21T18:05:59.611Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Jan 21 18:05:59 compute-0 podman[192557]: 2026-01-21 18:05:59.635231951 +0000 UTC m=+0.077258168 container health_status 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=starting, health_failing_streak=1, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 21 18:05:59 compute-0 systemd[1]: 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb-5910d0a329b017a4.service: Main process exited, code=exited, status=1/FAILURE
Jan 21 18:05:59 compute-0 systemd[1]: 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb-5910d0a329b017a4.service: Failed with result 'exit-code'.
Jan 21 18:05:59 compute-0 nova_compute[183278]: 2026-01-21 18:05:59.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:05:59 compute-0 nova_compute[183278]: 2026-01-21 18:05:59.818 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:05:59 compute-0 nova_compute[183278]: 2026-01-21 18:05:59.818 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 18:05:59 compute-0 nova_compute[183278]: 2026-01-21 18:05:59.818 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 18:05:59 compute-0 nova_compute[183278]: 2026-01-21 18:05:59.840 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 21 18:05:59 compute-0 nova_compute[183278]: 2026-01-21 18:05:59.840 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:05:59 compute-0 nova_compute[183278]: 2026-01-21 18:05:59.841 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:05:59 compute-0 nova_compute[183278]: 2026-01-21 18:05:59.841 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:05:59 compute-0 nova_compute[183278]: 2026-01-21 18:05:59.841 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:05:59 compute-0 nova_compute[183278]: 2026-01-21 18:05:59.841 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:05:59 compute-0 nova_compute[183278]: 2026-01-21 18:05:59.841 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:05:59 compute-0 nova_compute[183278]: 2026-01-21 18:05:59.842 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 18:05:59 compute-0 nova_compute[183278]: 2026-01-21 18:05:59.842 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:05:59 compute-0 nova_compute[183278]: 2026-01-21 18:05:59.866 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:05:59 compute-0 nova_compute[183278]: 2026-01-21 18:05:59.866 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:05:59 compute-0 nova_compute[183278]: 2026-01-21 18:05:59.867 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:05:59 compute-0 nova_compute[183278]: 2026-01-21 18:05:59.867 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 18:06:00 compute-0 nova_compute[183278]: 2026-01-21 18:06:00.061 183284 WARNING nova.virt.libvirt.driver [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 18:06:00 compute-0 nova_compute[183278]: 2026-01-21 18:06:00.063 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6083MB free_disk=73.53124237060547GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 18:06:00 compute-0 nova_compute[183278]: 2026-01-21 18:06:00.063 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:06:00 compute-0 nova_compute[183278]: 2026-01-21 18:06:00.063 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:06:00 compute-0 nova_compute[183278]: 2026-01-21 18:06:00.134 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 18:06:00 compute-0 nova_compute[183278]: 2026-01-21 18:06:00.134 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 18:06:00 compute-0 nova_compute[183278]: 2026-01-21 18:06:00.175 183284 DEBUG nova.compute.provider_tree [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Inventory has not changed in ProviderTree for provider: 502e4243-611b-433d-a766-9b485d51652d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 18:06:00 compute-0 nova_compute[183278]: 2026-01-21 18:06:00.191 183284 DEBUG nova.scheduler.client.report [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Inventory has not changed for provider 502e4243-611b-433d-a766-9b485d51652d based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 18:06:00 compute-0 nova_compute[183278]: 2026-01-21 18:06:00.192 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 18:06:00 compute-0 nova_compute[183278]: 2026-01-21 18:06:00.192 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.129s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:06:00 compute-0 python3.9[192744]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 21 18:06:01 compute-0 anacron[143712]: Job `cron.daily' started
Jan 21 18:06:01 compute-0 anacron[143712]: Job `cron.daily' terminated
Jan 21 18:06:01 compute-0 sudo[192896]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xbneaceevlqgoyxmxsfhtmginenplunr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018761.309937-1051-163876922947749/AnsiballZ_stat.py'
Jan 21 18:06:01 compute-0 sudo[192896]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:06:01 compute-0 python3.9[192898]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:06:01 compute-0 sudo[192896]: pam_unix(sudo:session): session closed for user root
Jan 21 18:06:02 compute-0 sudo[193021]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nqkmytewbafwmncmzqjwjdjzpfagcmbx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018761.309937-1051-163876922947749/AnsiballZ_copy.py'
Jan 21 18:06:02 compute-0 sudo[193021]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:06:02 compute-0 python3.9[193023]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769018761.309937-1051-163876922947749/.source.yaml _original_basename=.hicw_ztg follow=False checksum=adb9d3a8c29ddf899638ebcd7e0212ae2a63754b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:06:02 compute-0 sudo[193021]: pam_unix(sudo:session): session closed for user root
Jan 21 18:06:03 compute-0 sudo[193173]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qzpccpsvqualaoiiuyyptebtmlqbzirc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018762.8162365-1081-280517829274892/AnsiballZ_stat.py'
Jan 21 18:06:03 compute-0 sudo[193173]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:06:03 compute-0 python3.9[193175]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:06:03 compute-0 sudo[193173]: pam_unix(sudo:session): session closed for user root
Jan 21 18:06:03 compute-0 sudo[193296]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ijbimkqkpmjhbpiiixuqvijjjxwmtwqr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018762.8162365-1081-280517829274892/AnsiballZ_copy.py'
Jan 21 18:06:03 compute-0 sudo[193296]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:06:03 compute-0 python3.9[193298]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769018762.8162365-1081-280517829274892/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:06:03 compute-0 sudo[193296]: pam_unix(sudo:session): session closed for user root
Jan 21 18:06:04 compute-0 sudo[193448]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kqccqomhokalfklexlcrnjquyynoxmjm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018764.4549882-1123-180038064193245/AnsiballZ_file.py'
Jan 21 18:06:04 compute-0 sudo[193448]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:06:04 compute-0 python3.9[193450]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:06:04 compute-0 sudo[193448]: pam_unix(sudo:session): session closed for user root
Jan 21 18:06:05 compute-0 sudo[193600]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tslsqvwfoavdvbboedjuaikrbghwtnvm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018765.0999727-1139-24408472702895/AnsiballZ_file.py'
Jan 21 18:06:05 compute-0 sudo[193600]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:06:05 compute-0 python3.9[193602]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:06:05 compute-0 sudo[193600]: pam_unix(sudo:session): session closed for user root
Jan 21 18:06:06 compute-0 python3.9[193752]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/openstack_network_exporter state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:06:08 compute-0 sudo[194173]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kneshsuljkinxlqdkvfvauojqbehdorr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018768.619189-1207-220440082089159/AnsiballZ_container_config_data.py'
Jan 21 18:06:08 compute-0 sudo[194173]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:06:09 compute-0 python3.9[194175]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/openstack_network_exporter config_pattern=*.json debug=False
Jan 21 18:06:09 compute-0 sudo[194173]: pam_unix(sudo:session): session closed for user root
Jan 21 18:06:09 compute-0 sudo[194325]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sezuqeucxyhuwolrhiaspoykhmoeavuk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018769.5432322-1229-8274523556004/AnsiballZ_container_config_hash.py'
Jan 21 18:06:09 compute-0 sudo[194325]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:06:10 compute-0 python3.9[194327]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 21 18:06:10 compute-0 sudo[194325]: pam_unix(sudo:session): session closed for user root
Jan 21 18:06:10 compute-0 sudo[194477]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-icsaayhfhwxkfdwqciwqteilepbpyzjz ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769018770.4493055-1249-179663394203952/AnsiballZ_edpm_container_manage.py'
Jan 21 18:06:10 compute-0 sudo[194477]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:06:11 compute-0 python3[194479]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/openstack_network_exporter config_id=openstack_network_exporter config_overrides={} config_patterns=*.json containers=['openstack_network_exporter'] log_base_path=/var/log/containers/stdouts debug=False
Jan 21 18:06:13 compute-0 podman[194490]: 2026-01-21 18:06:13.423783947 +0000 UTC m=+2.342392033 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Jan 21 18:06:13 compute-0 podman[194588]: 2026-01-21 18:06:13.546450995 +0000 UTC m=+0.043763103 container create 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.buildah.version=1.33.7, distribution-scope=public, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.expose-services=, vcs-type=git, container_name=openstack_network_exporter, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6)
Jan 21 18:06:13 compute-0 podman[194588]: 2026-01-21 18:06:13.524444586 +0000 UTC m=+0.021756734 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Jan 21 18:06:13 compute-0 python3[194479]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name openstack_network_exporter --conmon-pidfile /run/openstack_network_exporter.pid --env OPENSTACK_NETWORK_EXPORTER_YAML=/etc/openstack_network_exporter/openstack_network_exporter.yaml --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6 --healthcheck-command /openstack/healthcheck openstack-netwo --label config_id=openstack_network_exporter --label container_name=openstack_network_exporter --label managed_by=edpm_ansible --label config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9105:9105 --volume /var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z --volume /var/run/openvswitch:/run/openvswitch:rw,z --volume /var/lib/openvswitch/ovn:/run/ovn:rw,z --volume /proc:/host/proc:ro --volume /var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Jan 21 18:06:13 compute-0 sudo[194477]: pam_unix(sudo:session): session closed for user root
Jan 21 18:06:14 compute-0 sudo[194776]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-itfpnhbsopxarixkbkebtpqyyvkjmryb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018774.534367-1265-98750376953179/AnsiballZ_stat.py'
Jan 21 18:06:14 compute-0 sudo[194776]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:06:14 compute-0 python3.9[194778]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 18:06:15 compute-0 sudo[194776]: pam_unix(sudo:session): session closed for user root
Jan 21 18:06:15 compute-0 sudo[194930]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aywizkqwtohduoprlaritntkazzijvxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018775.3289888-1283-60594186564005/AnsiballZ_file.py'
Jan 21 18:06:15 compute-0 sudo[194930]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:06:15 compute-0 python3.9[194932]: ansible-file Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:06:15 compute-0 sudo[194930]: pam_unix(sudo:session): session closed for user root
Jan 21 18:06:15 compute-0 sudo[195006]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hwjoswkdxxkfjjfmmsolihoobdohepyh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018775.3289888-1283-60594186564005/AnsiballZ_stat.py'
Jan 21 18:06:15 compute-0 sudo[195006]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:06:16 compute-0 python3.9[195008]: ansible-stat Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 18:06:16 compute-0 sudo[195006]: pam_unix(sudo:session): session closed for user root
Jan 21 18:06:16 compute-0 sudo[195157]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-syygqznzycskioqqiifqdctsohgtogcf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018776.2269344-1283-231277273862573/AnsiballZ_copy.py'
Jan 21 18:06:16 compute-0 sudo[195157]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:06:16 compute-0 python3.9[195159]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769018776.2269344-1283-231277273862573/source dest=/etc/systemd/system/edpm_openstack_network_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:06:16 compute-0 sudo[195157]: pam_unix(sudo:session): session closed for user root
Jan 21 18:06:17 compute-0 sudo[195233]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ojkxlyhyqrcfuqlfuaaulwweffrkxrvi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018776.2269344-1283-231277273862573/AnsiballZ_systemd.py'
Jan 21 18:06:17 compute-0 sudo[195233]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:06:17 compute-0 python3.9[195235]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 21 18:06:17 compute-0 systemd[1]: Reloading.
Jan 21 18:06:17 compute-0 systemd-rc-local-generator[195261]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:06:17 compute-0 systemd-sysv-generator[195265]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:06:17 compute-0 sudo[195233]: pam_unix(sudo:session): session closed for user root
Jan 21 18:06:17 compute-0 sudo[195345]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ggtsiozjnjhzjsfuteairtsksypjzzmy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018776.2269344-1283-231277273862573/AnsiballZ_systemd.py'
Jan 21 18:06:17 compute-0 sudo[195345]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:06:18 compute-0 python3.9[195347]: ansible-systemd Invoked with state=restarted name=edpm_openstack_network_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 18:06:18 compute-0 systemd[1]: Reloading.
Jan 21 18:06:18 compute-0 systemd-sysv-generator[195379]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:06:18 compute-0 systemd-rc-local-generator[195374]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:06:18 compute-0 systemd[1]: Starting openstack_network_exporter container...
Jan 21 18:06:18 compute-0 systemd[1]: Started libcrun container.
Jan 21 18:06:18 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/01db6fdcbd150d482d172db9fb31c086b3d2fe478a5783b78da97282a095fbb0/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Jan 21 18:06:18 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/01db6fdcbd150d482d172db9fb31c086b3d2fe478a5783b78da97282a095fbb0/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Jan 21 18:06:18 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/01db6fdcbd150d482d172db9fb31c086b3d2fe478a5783b78da97282a095fbb0/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Jan 21 18:06:18 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b.
Jan 21 18:06:18 compute-0 podman[195387]: 2026-01-21 18:06:18.633692553 +0000 UTC m=+0.115938738 container init 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, config_id=openstack_network_exporter, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., release=1755695350, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=)
Jan 21 18:06:18 compute-0 openstack_network_exporter[195402]: INFO    18:06:18 main.go:48: registering *bridge.Collector
Jan 21 18:06:18 compute-0 openstack_network_exporter[195402]: INFO    18:06:18 main.go:48: registering *coverage.Collector
Jan 21 18:06:18 compute-0 openstack_network_exporter[195402]: INFO    18:06:18 main.go:48: registering *datapath.Collector
Jan 21 18:06:18 compute-0 openstack_network_exporter[195402]: INFO    18:06:18 main.go:48: registering *iface.Collector
Jan 21 18:06:18 compute-0 openstack_network_exporter[195402]: INFO    18:06:18 main.go:48: registering *memory.Collector
Jan 21 18:06:18 compute-0 openstack_network_exporter[195402]: INFO    18:06:18 main.go:55: *ovnnorthd.Collector not registered, metric set not enabled
Jan 21 18:06:18 compute-0 openstack_network_exporter[195402]: INFO    18:06:18 main.go:48: registering *ovn.Collector
Jan 21 18:06:18 compute-0 openstack_network_exporter[195402]: INFO    18:06:18 main.go:55: *ovsdbserver.Collector not registered, metric set not enabled
Jan 21 18:06:18 compute-0 openstack_network_exporter[195402]: INFO    18:06:18 main.go:48: registering *pmd_perf.Collector
Jan 21 18:06:18 compute-0 openstack_network_exporter[195402]: INFO    18:06:18 main.go:48: registering *pmd_rxq.Collector
Jan 21 18:06:18 compute-0 openstack_network_exporter[195402]: INFO    18:06:18 main.go:48: registering *vswitch.Collector
Jan 21 18:06:18 compute-0 openstack_network_exporter[195402]: NOTICE  18:06:18 main.go:76: listening on https://:9105/metrics
Jan 21 18:06:18 compute-0 podman[195387]: 2026-01-21 18:06:18.673204813 +0000 UTC m=+0.155450978 container start 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.buildah.version=1.33.7, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, architecture=x86_64, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.openshift.expose-services=, vcs-type=git)
Jan 21 18:06:18 compute-0 podman[195387]: openstack_network_exporter
Jan 21 18:06:18 compute-0 systemd[1]: Started openstack_network_exporter container.
Jan 21 18:06:18 compute-0 sudo[195345]: pam_unix(sudo:session): session closed for user root
Jan 21 18:06:18 compute-0 podman[195412]: 2026-01-21 18:06:18.768908824 +0000 UTC m=+0.079979544 container health_status 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, container_name=openstack_network_exporter, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., version=9.6, name=ubi9-minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 21 18:06:19 compute-0 python3.9[195587]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 21 18:06:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:06:20.060 104698 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:06:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:06:20.061 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:06:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:06:20.061 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:06:20 compute-0 sudo[195737]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fbjhvsfgbciariwceknxkczzonkyxbxl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018780.5955405-1373-247182534059786/AnsiballZ_stat.py'
Jan 21 18:06:20 compute-0 sudo[195737]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:06:21 compute-0 python3.9[195739]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:06:21 compute-0 sudo[195737]: pam_unix(sudo:session): session closed for user root
Jan 21 18:06:21 compute-0 sudo[195862]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jmwkkonxkeamwooksfjodmpxrqekjerk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018780.5955405-1373-247182534059786/AnsiballZ_copy.py'
Jan 21 18:06:21 compute-0 sudo[195862]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:06:21 compute-0 python3.9[195864]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769018780.5955405-1373-247182534059786/.source.yaml _original_basename=.42zalx38 follow=False checksum=251d59b9aca4861f5212c6f5466da45fed9562ca backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:06:21 compute-0 sudo[195862]: pam_unix(sudo:session): session closed for user root
Jan 21 18:06:22 compute-0 sudo[196014]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-byjtoicgnxlbkxjsbtibemsgctdruaam ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018782.0975814-1403-269919203066368/AnsiballZ_find.py'
Jan 21 18:06:22 compute-0 sudo[196014]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:06:22 compute-0 python3.9[196016]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 21 18:06:22 compute-0 sudo[196014]: pam_unix(sudo:session): session closed for user root
Jan 21 18:06:23 compute-0 sudo[196166]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxuhtbblgbppfdwejnzhcmodhmiklmnq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018782.9088123-1422-29826245182530/AnsiballZ_podman_container_info.py'
Jan 21 18:06:23 compute-0 sudo[196166]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:06:23 compute-0 python3.9[196168]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman
Jan 21 18:06:23 compute-0 sudo[196166]: pam_unix(sudo:session): session closed for user root
Jan 21 18:06:24 compute-0 sudo[196331]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dticuinqvonddpcuxczcynysssekvxnk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018783.70564-1430-266494393432344/AnsiballZ_podman_container_exec.py'
Jan 21 18:06:24 compute-0 sudo[196331]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:06:24 compute-0 python3.9[196333]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 21 18:06:24 compute-0 systemd[1]: Started libpod-conmon-16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a.scope.
Jan 21 18:06:24 compute-0 podman[196334]: 2026-01-21 18:06:24.471280048 +0000 UTC m=+0.098656742 container exec 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 21 18:06:24 compute-0 podman[196334]: 2026-01-21 18:06:24.482786116 +0000 UTC m=+0.110162790 container exec_died 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Jan 21 18:06:24 compute-0 sudo[196331]: pam_unix(sudo:session): session closed for user root
Jan 21 18:06:24 compute-0 systemd[1]: libpod-conmon-16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a.scope: Deactivated successfully.
Jan 21 18:06:25 compute-0 sudo[196514]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hcmuogaqvkiujydrsiexfsndmiwqvduf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018784.7113254-1438-189966301578102/AnsiballZ_podman_container_exec.py'
Jan 21 18:06:25 compute-0 sudo[196514]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:06:25 compute-0 python3.9[196516]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 21 18:06:25 compute-0 systemd[1]: Started libpod-conmon-16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a.scope.
Jan 21 18:06:25 compute-0 podman[196517]: 2026-01-21 18:06:25.396006336 +0000 UTC m=+0.070047335 container exec 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202)
Jan 21 18:06:25 compute-0 podman[196517]: 2026-01-21 18:06:25.42990525 +0000 UTC m=+0.103946229 container exec_died 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=ovn_controller)
Jan 21 18:06:25 compute-0 systemd[1]: libpod-conmon-16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a.scope: Deactivated successfully.
Jan 21 18:06:25 compute-0 sudo[196514]: pam_unix(sudo:session): session closed for user root
Jan 21 18:06:25 compute-0 sudo[196699]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vormmhajwqjkepwmgheugxylknndmbjj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018785.6294327-1446-242128992821376/AnsiballZ_file.py'
Jan 21 18:06:25 compute-0 sudo[196699]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:06:26 compute-0 python3.9[196701]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:06:26 compute-0 sudo[196699]: pam_unix(sudo:session): session closed for user root
Jan 21 18:06:26 compute-0 podman[196826]: 2026-01-21 18:06:26.675056209 +0000 UTC m=+0.070041085 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2)
Jan 21 18:06:26 compute-0 sudo[196882]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-scohsteeiwviadnptyyluvmogvwwufdx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018786.3401678-1455-112654038978896/AnsiballZ_podman_container_info.py'
Jan 21 18:06:26 compute-0 sudo[196882]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:06:26 compute-0 podman[196825]: 2026-01-21 18:06:26.722389117 +0000 UTC m=+0.117796882 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 18:06:26 compute-0 python3.9[196895]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman
Jan 21 18:06:26 compute-0 sudo[196882]: pam_unix(sudo:session): session closed for user root
Jan 21 18:06:27 compute-0 sudo[197063]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ehiivqmcsqktlbevzatlltqouazovsgv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018787.2955565-1463-136047033009901/AnsiballZ_podman_container_exec.py'
Jan 21 18:06:27 compute-0 sudo[197063]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:06:27 compute-0 python3.9[197065]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 21 18:06:27 compute-0 systemd[1]: Started libpod-conmon-db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456.scope.
Jan 21 18:06:27 compute-0 podman[197066]: 2026-01-21 18:06:27.909112 +0000 UTC m=+0.071375753 container exec db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 21 18:06:27 compute-0 podman[197066]: 2026-01-21 18:06:27.944096093 +0000 UTC m=+0.106359856 container exec_died db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 21 18:06:27 compute-0 sudo[197063]: pam_unix(sudo:session): session closed for user root
Jan 21 18:06:27 compute-0 systemd[1]: libpod-conmon-db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456.scope: Deactivated successfully.
Jan 21 18:06:28 compute-0 sudo[197247]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ubltnftkvelyujevfkfcxdnjcrqbdtip ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018788.1592717-1471-44271859889843/AnsiballZ_podman_container_exec.py'
Jan 21 18:06:28 compute-0 sudo[197247]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:06:28 compute-0 python3.9[197249]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 21 18:06:28 compute-0 systemd[1]: Started libpod-conmon-db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456.scope.
Jan 21 18:06:28 compute-0 podman[197250]: 2026-01-21 18:06:28.773572728 +0000 UTC m=+0.120361309 container exec db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 21 18:06:28 compute-0 podman[197250]: 2026-01-21 18:06:28.805901444 +0000 UTC m=+0.152690025 container exec_died db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 18:06:28 compute-0 systemd[1]: libpod-conmon-db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456.scope: Deactivated successfully.
Jan 21 18:06:28 compute-0 sudo[197247]: pam_unix(sudo:session): session closed for user root
Jan 21 18:06:29 compute-0 sudo[197431]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwzvuulbfchdjfzsptthdkuemijorlmx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018788.990958-1479-270923470619358/AnsiballZ_file.py'
Jan 21 18:06:29 compute-0 sudo[197431]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:06:29 compute-0 python3.9[197433]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_metadata_agent recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:06:29 compute-0 sudo[197431]: pam_unix(sudo:session): session closed for user root
Jan 21 18:06:29 compute-0 sudo[197596]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqcorbicaxpyorjspiapbajzavtuxtxw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018789.6177464-1488-120763805326251/AnsiballZ_podman_container_info.py'
Jan 21 18:06:29 compute-0 sudo[197596]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:06:29 compute-0 podman[197557]: 2026-01-21 18:06:29.877467139 +0000 UTC m=+0.052909055 container health_status 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 21 18:06:30 compute-0 python3.9[197604]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman
Jan 21 18:06:30 compute-0 sudo[197596]: pam_unix(sudo:session): session closed for user root
Jan 21 18:06:30 compute-0 sudo[197773]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ikvruqfstjcnuqaypdoiacehbiqhncrp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018790.2812672-1496-250103441284489/AnsiballZ_podman_container_exec.py'
Jan 21 18:06:30 compute-0 sudo[197773]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:06:30 compute-0 python3.9[197775]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 21 18:06:30 compute-0 systemd[1]: Started libpod-conmon-602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb.scope.
Jan 21 18:06:30 compute-0 podman[197776]: 2026-01-21 18:06:30.781428585 +0000 UTC m=+0.072552692 container exec 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 21 18:06:30 compute-0 podman[197776]: 2026-01-21 18:06:30.814793808 +0000 UTC m=+0.105917915 container exec_died 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 18:06:30 compute-0 systemd[1]: libpod-conmon-602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb.scope: Deactivated successfully.
Jan 21 18:06:30 compute-0 sudo[197773]: pam_unix(sudo:session): session closed for user root
Jan 21 18:06:31 compute-0 sudo[197957]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-emyigdzsirbfewnwtwtckmkdnwzazklz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018791.0185804-1504-108770576102775/AnsiballZ_podman_container_exec.py'
Jan 21 18:06:31 compute-0 sudo[197957]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:06:31 compute-0 python3.9[197959]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 21 18:06:31 compute-0 systemd[1]: Started libpod-conmon-602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb.scope.
Jan 21 18:06:31 compute-0 podman[197960]: 2026-01-21 18:06:31.662721639 +0000 UTC m=+0.084621337 container exec 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 21 18:06:31 compute-0 podman[197960]: 2026-01-21 18:06:31.698188034 +0000 UTC m=+0.120087732 container exec_died 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 21 18:06:31 compute-0 sudo[197957]: pam_unix(sudo:session): session closed for user root
Jan 21 18:06:31 compute-0 systemd[1]: libpod-conmon-602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb.scope: Deactivated successfully.
Jan 21 18:06:32 compute-0 sudo[198141]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ycqetupxkivlbzerxceugqidztbemxit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018791.9396095-1512-17502636332028/AnsiballZ_file.py'
Jan 21 18:06:32 compute-0 sudo[198141]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:06:32 compute-0 python3.9[198143]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:06:32 compute-0 sudo[198141]: pam_unix(sudo:session): session closed for user root
Jan 21 18:06:32 compute-0 sudo[198293]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-byqtdxgjimgfkujswvkqvzwumrjhzejn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018792.6043992-1521-197686676263940/AnsiballZ_podman_container_info.py'
Jan 21 18:06:32 compute-0 sudo[198293]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:06:33 compute-0 python3.9[198295]: ansible-containers.podman.podman_container_info Invoked with name=['openstack_network_exporter'] executable=podman
Jan 21 18:06:33 compute-0 sudo[198293]: pam_unix(sudo:session): session closed for user root
Jan 21 18:06:33 compute-0 sudo[198459]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjewwrtngtnsgkhnzicfstiruiokemlf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018793.2979467-1529-25439668295510/AnsiballZ_podman_container_exec.py'
Jan 21 18:06:33 compute-0 sudo[198459]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:06:33 compute-0 python3.9[198461]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 21 18:06:33 compute-0 systemd[1]: Started libpod-conmon-9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b.scope.
Jan 21 18:06:33 compute-0 podman[198462]: 2026-01-21 18:06:33.836627117 +0000 UTC m=+0.069155346 container exec 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, vendor=Red Hat, Inc., distribution-scope=public, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, build-date=2025-08-20T13:12:41, name=ubi9-minimal, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_id=openstack_network_exporter, io.openshift.expose-services=, managed_by=edpm_ansible, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container)
Jan 21 18:06:33 compute-0 podman[198462]: 2026-01-21 18:06:33.86604717 +0000 UTC m=+0.098575409 container exec_died 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, maintainer=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9-minimal, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, distribution-scope=public, vcs-type=git, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 21 18:06:33 compute-0 systemd[1]: libpod-conmon-9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b.scope: Deactivated successfully.
Jan 21 18:06:33 compute-0 sudo[198459]: pam_unix(sudo:session): session closed for user root
Jan 21 18:06:34 compute-0 sudo[198643]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-taakidskgrxpyxvcsbacxzzforkgkqtl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018794.0834246-1537-71307352725583/AnsiballZ_podman_container_exec.py'
Jan 21 18:06:34 compute-0 sudo[198643]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:06:34 compute-0 python3.9[198645]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 21 18:06:34 compute-0 systemd[1]: Started libpod-conmon-9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b.scope.
Jan 21 18:06:34 compute-0 podman[198646]: 2026-01-21 18:06:34.615616899 +0000 UTC m=+0.066683834 container exec 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, version=9.6, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, config_id=openstack_network_exporter, name=ubi9-minimal, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 21 18:06:34 compute-0 podman[198646]: 2026-01-21 18:06:34.64894289 +0000 UTC m=+0.100009825 container exec_died 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, release=1755695350, architecture=x86_64, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, version=9.6, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 21 18:06:34 compute-0 systemd[1]: libpod-conmon-9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b.scope: Deactivated successfully.
Jan 21 18:06:34 compute-0 sudo[198643]: pam_unix(sudo:session): session closed for user root
Jan 21 18:06:35 compute-0 sudo[198826]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-msksuzllotmktfbiduapyhpjkkykbgsa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018794.869918-1545-98613246811783/AnsiballZ_file.py'
Jan 21 18:06:35 compute-0 sudo[198826]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:06:35 compute-0 python3.9[198828]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:06:35 compute-0 sudo[198826]: pam_unix(sudo:session): session closed for user root
Jan 21 18:06:48 compute-0 sudo[198978]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qywlqiwofvhdeaaxivlimihwugqzmehe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018808.4004068-1687-272671945043594/AnsiballZ_file.py'
Jan 21 18:06:48 compute-0 sudo[198978]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:06:48 compute-0 python3.9[198980]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:06:48 compute-0 sudo[198978]: pam_unix(sudo:session): session closed for user root
Jan 21 18:06:49 compute-0 podman[198981]: 2026-01-21 18:06:49.022481809 +0000 UTC m=+0.074790528 container health_status 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., name=ubi9-minimal, version=9.6, managed_by=edpm_ansible, vcs-type=git, distribution-scope=public, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 21 18:06:49 compute-0 sudo[199151]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fcelryhenalxreqsudjhezpprexmogzw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018809.1255138-1703-281227881769225/AnsiballZ_stat.py'
Jan 21 18:06:49 compute-0 sudo[199151]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:06:49 compute-0 python3.9[199153]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:06:49 compute-0 sudo[199151]: pam_unix(sudo:session): session closed for user root
Jan 21 18:06:49 compute-0 sudo[199274]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jstnvuladgckjxqzudifblfeptnoloir ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018809.1255138-1703-281227881769225/AnsiballZ_copy.py'
Jan 21 18:06:49 compute-0 sudo[199274]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:06:50 compute-0 python3.9[199276]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1769018809.1255138-1703-281227881769225/.source.yaml _original_basename=firewall.yaml follow=False checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:06:50 compute-0 sudo[199274]: pam_unix(sudo:session): session closed for user root
Jan 21 18:06:50 compute-0 sudo[199426]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ywnrcihvubxvnexkulmcngucbrtwvfyg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018810.5064461-1735-198246492557128/AnsiballZ_file.py'
Jan 21 18:06:50 compute-0 sudo[199426]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:06:50 compute-0 python3.9[199428]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:06:50 compute-0 sudo[199426]: pam_unix(sudo:session): session closed for user root
Jan 21 18:06:51 compute-0 sudo[199578]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xbjfffjsesvmqimhhioxrouorsziwgej ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018811.1653376-1751-190399223544471/AnsiballZ_stat.py'
Jan 21 18:06:51 compute-0 sudo[199578]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:06:51 compute-0 python3.9[199580]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:06:51 compute-0 sudo[199578]: pam_unix(sudo:session): session closed for user root
Jan 21 18:06:51 compute-0 sudo[199656]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kapfxazzrqhqijrppogamkkndyggpaov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018811.1653376-1751-190399223544471/AnsiballZ_file.py'
Jan 21 18:06:51 compute-0 sudo[199656]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:06:52 compute-0 python3.9[199658]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:06:52 compute-0 sudo[199656]: pam_unix(sudo:session): session closed for user root
Jan 21 18:06:52 compute-0 sudo[199808]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hhyqagnxmddfnzpprhxskranjmsxmzli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018812.701692-1775-93708857505448/AnsiballZ_stat.py'
Jan 21 18:06:52 compute-0 sudo[199808]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:06:53 compute-0 python3.9[199810]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:06:53 compute-0 sudo[199808]: pam_unix(sudo:session): session closed for user root
Jan 21 18:06:53 compute-0 sudo[199886]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wtwvflivisvojlmurvimrvvcwtzorlgw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018812.701692-1775-93708857505448/AnsiballZ_file.py'
Jan 21 18:06:53 compute-0 sudo[199886]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:06:53 compute-0 python3.9[199888]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.eb67ido3 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:06:53 compute-0 sudo[199886]: pam_unix(sudo:session): session closed for user root
Jan 21 18:06:53 compute-0 sshd-session[199889]: Invalid user ansible_user from 64.227.98.100 port 59820
Jan 21 18:06:53 compute-0 sshd-session[199889]: Connection closed by invalid user ansible_user 64.227.98.100 port 59820 [preauth]
Jan 21 18:06:54 compute-0 sudo[200040]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sybllmposwpunqftqnpxbwszdxkpiuax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018814.1734192-1799-133105146354156/AnsiballZ_stat.py'
Jan 21 18:06:54 compute-0 sudo[200040]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:06:54 compute-0 python3.9[200042]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:06:54 compute-0 sudo[200040]: pam_unix(sudo:session): session closed for user root
Jan 21 18:06:55 compute-0 sudo[200118]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jteonpuexrhelcdhpfldybxxblmeiaiu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018814.1734192-1799-133105146354156/AnsiballZ_file.py'
Jan 21 18:06:55 compute-0 sudo[200118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:06:55 compute-0 python3.9[200120]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:06:55 compute-0 sudo[200118]: pam_unix(sudo:session): session closed for user root
Jan 21 18:06:56 compute-0 sudo[200270]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qfpbjpyaugutshuuxxrsocbzvexriisp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018816.1010926-1825-130959771613261/AnsiballZ_command.py'
Jan 21 18:06:56 compute-0 sudo[200270]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:06:56 compute-0 python3.9[200272]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:06:56 compute-0 sudo[200270]: pam_unix(sudo:session): session closed for user root
Jan 21 18:06:57 compute-0 podman[200351]: 2026-01-21 18:06:57.036895158 +0000 UTC m=+0.088075794 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 18:06:57 compute-0 podman[200350]: 2026-01-21 18:06:57.037895643 +0000 UTC m=+0.089311755 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 21 18:06:57 compute-0 sudo[200466]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-emxwmkfpgdyusdmvezmejspjlqxclqlv ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769018816.8341572-1841-93330376038377/AnsiballZ_edpm_nftables_from_files.py'
Jan 21 18:06:57 compute-0 sudo[200466]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:06:57 compute-0 python3[200468]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 21 18:06:57 compute-0 sudo[200466]: pam_unix(sudo:session): session closed for user root
Jan 21 18:06:57 compute-0 sudo[200618]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ppfrmekkzmfcgcbndzohilkkecjiwdjm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018817.6616702-1857-82195954687828/AnsiballZ_stat.py'
Jan 21 18:06:57 compute-0 sudo[200618]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:06:58 compute-0 python3.9[200620]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:06:58 compute-0 sudo[200618]: pam_unix(sudo:session): session closed for user root
Jan 21 18:06:58 compute-0 sudo[200696]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-skratdejfzlmxbckswhdbgrrmdnjrrur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018817.6616702-1857-82195954687828/AnsiballZ_file.py'
Jan 21 18:06:58 compute-0 sudo[200696]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:06:58 compute-0 python3.9[200698]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:06:58 compute-0 sudo[200696]: pam_unix(sudo:session): session closed for user root
Jan 21 18:06:59 compute-0 sudo[200848]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-usbttisylagxkxyibdztfnkakeihapms ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018818.7875013-1881-125653147436775/AnsiballZ_stat.py'
Jan 21 18:06:59 compute-0 sudo[200848]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:06:59 compute-0 python3.9[200850]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:06:59 compute-0 sudo[200848]: pam_unix(sudo:session): session closed for user root
Jan 21 18:06:59 compute-0 sudo[200926]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zvkaagwkzflicdwinfgkqyjvgodtuwsr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018818.7875013-1881-125653147436775/AnsiballZ_file.py'
Jan 21 18:06:59 compute-0 sudo[200926]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:06:59 compute-0 python3.9[200928]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:06:59 compute-0 sudo[200926]: pam_unix(sudo:session): session closed for user root
Jan 21 18:06:59 compute-0 podman[200966]: 2026-01-21 18:06:59.989482229 +0000 UTC m=+0.043225052 container health_status 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 21 18:07:00 compute-0 nova_compute[183278]: 2026-01-21 18:07:00.276 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:07:00 compute-0 auditd[700]: Audit daemon rotating log files
Jan 21 18:07:00 compute-0 nova_compute[183278]: 2026-01-21 18:07:00.300 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:07:00 compute-0 nova_compute[183278]: 2026-01-21 18:07:00.300 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 18:07:00 compute-0 nova_compute[183278]: 2026-01-21 18:07:00.301 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 18:07:00 compute-0 nova_compute[183278]: 2026-01-21 18:07:00.314 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 21 18:07:00 compute-0 nova_compute[183278]: 2026-01-21 18:07:00.314 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:07:00 compute-0 nova_compute[183278]: 2026-01-21 18:07:00.314 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:07:00 compute-0 nova_compute[183278]: 2026-01-21 18:07:00.315 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:07:00 compute-0 nova_compute[183278]: 2026-01-21 18:07:00.341 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:07:00 compute-0 nova_compute[183278]: 2026-01-21 18:07:00.341 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:07:00 compute-0 nova_compute[183278]: 2026-01-21 18:07:00.341 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:07:00 compute-0 nova_compute[183278]: 2026-01-21 18:07:00.341 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 18:07:00 compute-0 sudo[201102]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gqzcyquwoogpdbvlczqdnzgkuzmecuxg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018819.9415185-1905-220836022373881/AnsiballZ_stat.py'
Jan 21 18:07:00 compute-0 sudo[201102]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:07:00 compute-0 nova_compute[183278]: 2026-01-21 18:07:00.494 183284 WARNING nova.virt.libvirt.driver [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 18:07:00 compute-0 nova_compute[183278]: 2026-01-21 18:07:00.495 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5994MB free_disk=73.41357421875GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 18:07:00 compute-0 nova_compute[183278]: 2026-01-21 18:07:00.495 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:07:00 compute-0 nova_compute[183278]: 2026-01-21 18:07:00.496 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:07:00 compute-0 nova_compute[183278]: 2026-01-21 18:07:00.564 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 18:07:00 compute-0 nova_compute[183278]: 2026-01-21 18:07:00.565 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 18:07:00 compute-0 nova_compute[183278]: 2026-01-21 18:07:00.587 183284 DEBUG nova.compute.provider_tree [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Inventory has not changed in ProviderTree for provider: 502e4243-611b-433d-a766-9b485d51652d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 18:07:00 compute-0 python3.9[201104]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:07:00 compute-0 sudo[201102]: pam_unix(sudo:session): session closed for user root
Jan 21 18:07:00 compute-0 nova_compute[183278]: 2026-01-21 18:07:00.727 183284 DEBUG nova.scheduler.client.report [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Inventory has not changed for provider 502e4243-611b-433d-a766-9b485d51652d based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 18:07:00 compute-0 nova_compute[183278]: 2026-01-21 18:07:00.728 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 18:07:00 compute-0 nova_compute[183278]: 2026-01-21 18:07:00.729 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.233s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:07:00 compute-0 sudo[201180]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vnvlwiwlkcgbmebppzeesrbguldpvxgt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018819.9415185-1905-220836022373881/AnsiballZ_file.py'
Jan 21 18:07:00 compute-0 sudo[201180]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:07:01 compute-0 python3.9[201182]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:07:01 compute-0 sudo[201180]: pam_unix(sudo:session): session closed for user root
Jan 21 18:07:01 compute-0 nova_compute[183278]: 2026-01-21 18:07:01.231 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:07:01 compute-0 nova_compute[183278]: 2026-01-21 18:07:01.231 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:07:01 compute-0 nova_compute[183278]: 2026-01-21 18:07:01.232 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 18:07:01 compute-0 sudo[201332]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-shrwzqwftwvfmocljszhvspcogtlvihs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018821.2295325-1929-194862087064143/AnsiballZ_stat.py'
Jan 21 18:07:01 compute-0 sudo[201332]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:07:01 compute-0 nova_compute[183278]: 2026-01-21 18:07:01.811 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:07:01 compute-0 nova_compute[183278]: 2026-01-21 18:07:01.815 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:07:01 compute-0 nova_compute[183278]: 2026-01-21 18:07:01.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:07:01 compute-0 python3.9[201334]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:07:01 compute-0 sudo[201332]: pam_unix(sudo:session): session closed for user root
Jan 21 18:07:02 compute-0 sudo[201410]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rruzbuyuzmcpqivlhgslblecwxvgqxry ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018821.2295325-1929-194862087064143/AnsiballZ_file.py'
Jan 21 18:07:02 compute-0 sudo[201410]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:07:02 compute-0 python3.9[201412]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:07:02 compute-0 sudo[201410]: pam_unix(sudo:session): session closed for user root
Jan 21 18:07:02 compute-0 sudo[201562]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lbhrgrnemrinfcreqewuevrhahkqbipd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018822.5702047-1953-106342163469879/AnsiballZ_stat.py'
Jan 21 18:07:02 compute-0 sudo[201562]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:07:03 compute-0 python3.9[201564]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:07:03 compute-0 sudo[201562]: pam_unix(sudo:session): session closed for user root
Jan 21 18:07:03 compute-0 sudo[201687]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-invmrzjhynshyrfyhujxrxjinvershzh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018822.5702047-1953-106342163469879/AnsiballZ_copy.py'
Jan 21 18:07:03 compute-0 sudo[201687]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:07:03 compute-0 python3.9[201689]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769018822.5702047-1953-106342163469879/.source.nft follow=False _original_basename=ruleset.j2 checksum=fb3275eced3a2e06312143189928124e1b2df34a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:07:03 compute-0 sudo[201687]: pam_unix(sudo:session): session closed for user root
Jan 21 18:07:04 compute-0 sudo[201839]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-smkousixwverdzmnfjhovlggnbdnysix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018823.9120069-1983-112100266435964/AnsiballZ_file.py'
Jan 21 18:07:04 compute-0 sudo[201839]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:07:04 compute-0 python3.9[201841]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:07:04 compute-0 sudo[201839]: pam_unix(sudo:session): session closed for user root
Jan 21 18:07:04 compute-0 sudo[201991]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-alxvoucuejphvxjmvwgljumxuwrtpigj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018824.5863998-1999-181269194977133/AnsiballZ_command.py'
Jan 21 18:07:04 compute-0 sudo[201991]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:07:05 compute-0 python3.9[201993]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:07:05 compute-0 sudo[201991]: pam_unix(sudo:session): session closed for user root
Jan 21 18:07:05 compute-0 sudo[202146]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bscpeyxbkuppgfqmaeudseaomrtpgcqb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018825.3318768-2015-74164307516264/AnsiballZ_blockinfile.py'
Jan 21 18:07:05 compute-0 sudo[202146]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:07:05 compute-0 python3.9[202148]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:07:05 compute-0 sudo[202146]: pam_unix(sudo:session): session closed for user root
Jan 21 18:07:06 compute-0 sudo[202298]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aevivkngwspijbhsnhexmnuhoytjqbcf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018826.237289-2033-192581235625583/AnsiballZ_command.py'
Jan 21 18:07:06 compute-0 sudo[202298]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:07:06 compute-0 python3.9[202300]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:07:06 compute-0 sudo[202298]: pam_unix(sudo:session): session closed for user root
Jan 21 18:07:07 compute-0 sudo[202451]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jswaifajmqdjzcueeiohuvlwargmpsev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018826.9376154-2049-96557446334/AnsiballZ_stat.py'
Jan 21 18:07:07 compute-0 sudo[202451]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:07:07 compute-0 python3.9[202453]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 18:07:07 compute-0 sudo[202451]: pam_unix(sudo:session): session closed for user root
Jan 21 18:07:07 compute-0 sudo[202605]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wtugxfxzsrgigespkfgskcoltjyrpceb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018827.7019026-2065-31062079432898/AnsiballZ_command.py'
Jan 21 18:07:07 compute-0 sudo[202605]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:07:08 compute-0 python3.9[202607]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:07:08 compute-0 sudo[202605]: pam_unix(sudo:session): session closed for user root
Jan 21 18:07:08 compute-0 openstack_network_exporter[195402]: ERROR   18:07:08 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 21 18:07:08 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:07:08 compute-0 openstack_network_exporter[195402]: ERROR   18:07:08 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 21 18:07:08 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:07:08 compute-0 sudo[202765]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ebvzhgrjgvjuwuujkwgeozodykmbogfl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769018828.3980439-2081-249430104295824/AnsiballZ_file.py'
Jan 21 18:07:08 compute-0 sudo[202765]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:07:08 compute-0 python3.9[202767]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:07:08 compute-0 sudo[202765]: pam_unix(sudo:session): session closed for user root
Jan 21 18:07:09 compute-0 sshd-session[183581]: Connection closed by 192.168.122.30 port 45112
Jan 21 18:07:09 compute-0 sshd-session[183578]: pam_unix(sshd:session): session closed for user zuul
Jan 21 18:07:09 compute-0 systemd-logind[782]: Session 27 logged out. Waiting for processes to exit.
Jan 21 18:07:09 compute-0 systemd[1]: session-27.scope: Deactivated successfully.
Jan 21 18:07:09 compute-0 systemd[1]: session-27.scope: Consumed 1min 13.302s CPU time.
Jan 21 18:07:09 compute-0 systemd-logind[782]: Removed session 27.
Jan 21 18:07:20 compute-0 podman[202792]: 2026-01-21 18:07:20.007294818 +0000 UTC m=+0.060717713 container health_status 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, release=1755695350, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, architecture=x86_64, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, container_name=openstack_network_exporter, vendor=Red Hat, Inc., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, distribution-scope=public, managed_by=edpm_ansible)
Jan 21 18:07:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:07:20.062 104698 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:07:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:07:20.062 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:07:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:07:20.062 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:07:28 compute-0 podman[202814]: 2026-01-21 18:07:28.037972109 +0000 UTC m=+0.093707676 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 21 18:07:28 compute-0 podman[202815]: 2026-01-21 18:07:28.038681077 +0000 UTC m=+0.084917165 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 21 18:07:29 compute-0 podman[192560]: time="2026-01-21T18:07:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 18:07:29 compute-0 podman[192560]: @ - - [21/Jan/2026:18:07:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15354 "" "Go-http-client/1.1"
Jan 21 18:07:29 compute-0 podman[192560]: @ - - [21/Jan/2026:18:07:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2143 "" "Go-http-client/1.1"
Jan 21 18:07:30 compute-0 podman[202862]: 2026-01-21 18:07:30.983333519 +0000 UTC m=+0.041277013 container health_status 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 21 18:07:31 compute-0 openstack_network_exporter[195402]: ERROR   18:07:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 21 18:07:31 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:07:31 compute-0 openstack_network_exporter[195402]: ERROR   18:07:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 21 18:07:31 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:07:50 compute-0 podman[202887]: 2026-01-21 18:07:50.991276223 +0000 UTC m=+0.052911502 container health_status 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, release=1755695350, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vcs-type=git, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., version=9.6, config_id=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 21 18:07:58 compute-0 podman[202909]: 2026-01-21 18:07:58.991467865 +0000 UTC m=+0.047241753 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Jan 21 18:07:59 compute-0 podman[202908]: 2026-01-21 18:07:59.020057598 +0000 UTC m=+0.079905616 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 18:08:00 compute-0 nova_compute[183278]: 2026-01-21 18:08:00.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:08:00 compute-0 nova_compute[183278]: 2026-01-21 18:08:00.816 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 18:08:00 compute-0 nova_compute[183278]: 2026-01-21 18:08:00.816 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 18:08:00 compute-0 nova_compute[183278]: 2026-01-21 18:08:00.836 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 21 18:08:00 compute-0 nova_compute[183278]: 2026-01-21 18:08:00.837 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:08:00 compute-0 nova_compute[183278]: 2026-01-21 18:08:00.837 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:08:00 compute-0 nova_compute[183278]: 2026-01-21 18:08:00.865 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:08:00 compute-0 nova_compute[183278]: 2026-01-21 18:08:00.865 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:08:00 compute-0 nova_compute[183278]: 2026-01-21 18:08:00.866 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:08:00 compute-0 nova_compute[183278]: 2026-01-21 18:08:00.866 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 18:08:01 compute-0 nova_compute[183278]: 2026-01-21 18:08:01.009 183284 WARNING nova.virt.libvirt.driver [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 18:08:01 compute-0 nova_compute[183278]: 2026-01-21 18:08:01.010 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6126MB free_disk=73.41801452636719GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 18:08:01 compute-0 nova_compute[183278]: 2026-01-21 18:08:01.010 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:08:01 compute-0 nova_compute[183278]: 2026-01-21 18:08:01.010 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:08:01 compute-0 nova_compute[183278]: 2026-01-21 18:08:01.086 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 18:08:01 compute-0 nova_compute[183278]: 2026-01-21 18:08:01.087 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 18:08:01 compute-0 rsyslogd[1002]: imjournal from <np0005590981:nova_compute>: begin to drop messages due to rate-limiting
Jan 21 18:08:01 compute-0 nova_compute[183278]: 2026-01-21 18:08:01.110 183284 DEBUG nova.compute.provider_tree [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Inventory has not changed in ProviderTree for provider: 502e4243-611b-433d-a766-9b485d51652d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 18:08:01 compute-0 nova_compute[183278]: 2026-01-21 18:08:01.125 183284 DEBUG nova.scheduler.client.report [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Inventory has not changed for provider 502e4243-611b-433d-a766-9b485d51652d based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 18:08:01 compute-0 nova_compute[183278]: 2026-01-21 18:08:01.126 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 18:08:01 compute-0 nova_compute[183278]: 2026-01-21 18:08:01.126 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.116s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:08:01 compute-0 podman[202953]: 2026-01-21 18:08:01.982648944 +0000 UTC m=+0.042746261 container health_status 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 21 18:08:02 compute-0 nova_compute[183278]: 2026-01-21 18:08:02.105 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:08:02 compute-0 nova_compute[183278]: 2026-01-21 18:08:02.106 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:08:02 compute-0 nova_compute[183278]: 2026-01-21 18:08:02.106 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:08:02 compute-0 nova_compute[183278]: 2026-01-21 18:08:02.815 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:08:02 compute-0 nova_compute[183278]: 2026-01-21 18:08:02.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:08:02 compute-0 nova_compute[183278]: 2026-01-21 18:08:02.816 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 18:08:03 compute-0 nova_compute[183278]: 2026-01-21 18:08:03.812 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:08:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:08:20.063 104698 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:08:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:08:20.063 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:08:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:08:20.063 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:08:22 compute-0 podman[202977]: 2026-01-21 18:08:22.007490873 +0000 UTC m=+0.068645998 container health_status 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, version=9.6, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public)
Jan 21 18:08:23 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:08:23.065 104698 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e2:aa:66', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f6:39:87:73:c8'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 18:08:23 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:08:23.066 104698 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 21 18:08:23 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:08:23.067 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a4db021d-a451-4e5f-8011-49af760bda68, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:08:29 compute-0 podman[192560]: time="2026-01-21T18:08:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 18:08:29 compute-0 podman[192560]: @ - - [21/Jan/2026:18:08:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15354 "" "Go-http-client/1.1"
Jan 21 18:08:29 compute-0 podman[192560]: @ - - [21/Jan/2026:18:08:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2149 "" "Go-http-client/1.1"
Jan 21 18:08:30 compute-0 podman[203000]: 2026-01-21 18:08:30.000297464 +0000 UTC m=+0.056014187 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 21 18:08:30 compute-0 podman[202999]: 2026-01-21 18:08:30.020670945 +0000 UTC m=+0.080527280 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, 
org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 21 18:08:31 compute-0 openstack_network_exporter[195402]: ERROR   18:08:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 21 18:08:31 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:08:31 compute-0 openstack_network_exporter[195402]: ERROR   18:08:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 21 18:08:31 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:08:32 compute-0 podman[203041]: 2026-01-21 18:08:32.990765325 +0000 UTC m=+0.048228157 container health_status 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 18:08:52 compute-0 podman[203065]: 2026-01-21 18:08:52.998684343 +0000 UTC m=+0.052645187 container health_status 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, version=9.6, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal)
Jan 21 18:08:59 compute-0 podman[192560]: time="2026-01-21T18:08:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 18:08:59 compute-0 podman[192560]: @ - - [21/Jan/2026:18:08:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15354 "" "Go-http-client/1.1"
Jan 21 18:08:59 compute-0 podman[192560]: @ - - [21/Jan/2026:18:08:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2147 "" "Go-http-client/1.1"
Jan 21 18:09:00 compute-0 nova_compute[183278]: 2026-01-21 18:09:00.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:09:00 compute-0 nova_compute[183278]: 2026-01-21 18:09:00.816 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 18:09:00 compute-0 nova_compute[183278]: 2026-01-21 18:09:00.816 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 18:09:00 compute-0 nova_compute[183278]: 2026-01-21 18:09:00.891 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 21 18:09:01 compute-0 podman[203089]: 2026-01-21 18:09:01.004943306 +0000 UTC m=+0.060040339 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 21 18:09:01 compute-0 podman[203088]: 2026-01-21 18:09:01.072483289 +0000 UTC m=+0.132718630 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 21 18:09:01 compute-0 openstack_network_exporter[195402]: ERROR   18:09:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 21 18:09:01 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:09:01 compute-0 openstack_network_exporter[195402]: ERROR   18:09:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 21 18:09:01 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:09:01 compute-0 nova_compute[183278]: 2026-01-21 18:09:01.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:09:01 compute-0 nova_compute[183278]: 2026-01-21 18:09:01.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:09:01 compute-0 nova_compute[183278]: 2026-01-21 18:09:01.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:09:03 compute-0 nova_compute[183278]: 2026-01-21 18:09:03.735 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:09:03 compute-0 nova_compute[183278]: 2026-01-21 18:09:03.735 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:09:03 compute-0 nova_compute[183278]: 2026-01-21 18:09:03.735 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:09:03 compute-0 nova_compute[183278]: 2026-01-21 18:09:03.735 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 18:09:03 compute-0 nova_compute[183278]: 2026-01-21 18:09:03.872 183284 WARNING nova.virt.libvirt.driver [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 18:09:03 compute-0 nova_compute[183278]: 2026-01-21 18:09:03.873 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6162MB free_disk=73.41806030273438GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 18:09:03 compute-0 nova_compute[183278]: 2026-01-21 18:09:03.873 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:09:03 compute-0 nova_compute[183278]: 2026-01-21 18:09:03.873 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:09:03 compute-0 podman[203131]: 2026-01-21 18:09:03.983168661 +0000 UTC m=+0.044675509 container health_status 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 21 18:09:04 compute-0 nova_compute[183278]: 2026-01-21 18:09:04.461 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 18:09:04 compute-0 nova_compute[183278]: 2026-01-21 18:09:04.462 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 18:09:04 compute-0 nova_compute[183278]: 2026-01-21 18:09:04.487 183284 DEBUG nova.compute.provider_tree [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Inventory has not changed in ProviderTree for provider: 502e4243-611b-433d-a766-9b485d51652d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 18:09:04 compute-0 nova_compute[183278]: 2026-01-21 18:09:04.552 183284 DEBUG nova.scheduler.client.report [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Inventory has not changed for provider 502e4243-611b-433d-a766-9b485d51652d based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 18:09:04 compute-0 nova_compute[183278]: 2026-01-21 18:09:04.554 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 18:09:04 compute-0 nova_compute[183278]: 2026-01-21 18:09:04.555 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.681s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:09:05 compute-0 nova_compute[183278]: 2026-01-21 18:09:05.551 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:09:05 compute-0 nova_compute[183278]: 2026-01-21 18:09:05.653 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:09:05 compute-0 nova_compute[183278]: 2026-01-21 18:09:05.653 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:09:05 compute-0 nova_compute[183278]: 2026-01-21 18:09:05.654 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:09:05 compute-0 nova_compute[183278]: 2026-01-21 18:09:05.654 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:09:05 compute-0 nova_compute[183278]: 2026-01-21 18:09:05.654 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 18:09:05 compute-0 nova_compute[183278]: 2026-01-21 18:09:05.914 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:09:07 compute-0 sshd-session[203156]: Invalid user ansible_user from 64.227.98.100 port 53804
Jan 21 18:09:07 compute-0 sshd-session[203156]: Connection closed by invalid user ansible_user 64.227.98.100 port 53804 [preauth]
Jan 21 18:09:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:09:20.063 104698 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:09:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:09:20.064 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:09:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:09:20.064 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:09:23 compute-0 podman[203158]: 2026-01-21 18:09:23.994400129 +0000 UTC m=+0.055531177 container health_status 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, name=ubi9-minimal, release=1755695350, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., vcs-type=git, vendor=Red Hat, Inc.)
Jan 21 18:09:29 compute-0 podman[192560]: time="2026-01-21T18:09:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 18:09:29 compute-0 podman[192560]: @ - - [21/Jan/2026:18:09:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15354 "" "Go-http-client/1.1"
Jan 21 18:09:29 compute-0 podman[192560]: @ - - [21/Jan/2026:18:09:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2150 "" "Go-http-client/1.1"
Jan 21 18:09:31 compute-0 openstack_network_exporter[195402]: ERROR   18:09:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 21 18:09:31 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:09:31 compute-0 openstack_network_exporter[195402]: ERROR   18:09:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 21 18:09:31 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:09:32 compute-0 podman[203183]: 2026-01-21 18:09:32.003365369 +0000 UTC m=+0.053167458 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 21 18:09:32 compute-0 podman[203182]: 2026-01-21 18:09:32.029260042 +0000 UTC m=+0.075557004 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ovn_controller, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, 
managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 21 18:09:35 compute-0 podman[203227]: 2026-01-21 18:09:35.005923408 +0000 UTC m=+0.067933226 container health_status 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 21 18:09:55 compute-0 podman[203252]: 2026-01-21 18:09:55.038414216 +0000 UTC m=+0.081237498 container health_status 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, config_id=openstack_network_exporter, vendor=Red Hat, Inc., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, architecture=x86_64, distribution-scope=public, build-date=2025-08-20T13:12:41)
Jan 21 18:09:59 compute-0 podman[192560]: time="2026-01-21T18:09:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 18:09:59 compute-0 podman[192560]: @ - - [21/Jan/2026:18:09:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15354 "" "Go-http-client/1.1"
Jan 21 18:09:59 compute-0 podman[192560]: @ - - [21/Jan/2026:18:09:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2150 "" "Go-http-client/1.1"
Jan 21 18:09:59 compute-0 nova_compute[183278]: 2026-01-21 18:09:59.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:09:59 compute-0 nova_compute[183278]: 2026-01-21 18:09:59.817 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 21 18:10:00 compute-0 nova_compute[183278]: 2026-01-21 18:10:00.059 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 21 18:10:00 compute-0 nova_compute[183278]: 2026-01-21 18:10:00.061 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:10:00 compute-0 nova_compute[183278]: 2026-01-21 18:10:00.061 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 21 18:10:00 compute-0 nova_compute[183278]: 2026-01-21 18:10:00.499 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:10:01 compute-0 openstack_network_exporter[195402]: ERROR   18:10:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 21 18:10:01 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:10:01 compute-0 openstack_network_exporter[195402]: ERROR   18:10:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 21 18:10:01 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:10:02 compute-0 nova_compute[183278]: 2026-01-21 18:10:02.873 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:10:02 compute-0 nova_compute[183278]: 2026-01-21 18:10:02.873 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 18:10:02 compute-0 nova_compute[183278]: 2026-01-21 18:10:02.874 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 18:10:03 compute-0 podman[203274]: 2026-01-21 18:10:03.030601706 +0000 UTC m=+0.076895583 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 21 18:10:03 compute-0 podman[203273]: 2026-01-21 18:10:03.068445462 +0000 UTC m=+0.125365226 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 18:10:03 compute-0 nova_compute[183278]: 2026-01-21 18:10:03.442 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 21 18:10:03 compute-0 nova_compute[183278]: 2026-01-21 18:10:03.443 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:10:03 compute-0 nova_compute[183278]: 2026-01-21 18:10:03.443 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:10:03 compute-0 nova_compute[183278]: 2026-01-21 18:10:03.443 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:10:03 compute-0 nova_compute[183278]: 2026-01-21 18:10:03.648 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:10:03 compute-0 nova_compute[183278]: 2026-01-21 18:10:03.648 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:10:03 compute-0 nova_compute[183278]: 2026-01-21 18:10:03.648 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:10:03 compute-0 nova_compute[183278]: 2026-01-21 18:10:03.649 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 18:10:03 compute-0 nova_compute[183278]: 2026-01-21 18:10:03.781 183284 WARNING nova.virt.libvirt.driver [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 18:10:03 compute-0 nova_compute[183278]: 2026-01-21 18:10:03.782 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6191MB free_disk=73.41806030273438GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 18:10:03 compute-0 nova_compute[183278]: 2026-01-21 18:10:03.782 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:10:03 compute-0 nova_compute[183278]: 2026-01-21 18:10:03.783 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:10:03 compute-0 nova_compute[183278]: 2026-01-21 18:10:03.990 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 18:10:03 compute-0 nova_compute[183278]: 2026-01-21 18:10:03.991 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 18:10:04 compute-0 nova_compute[183278]: 2026-01-21 18:10:04.039 183284 DEBUG nova.scheduler.client.report [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Refreshing inventories for resource provider 502e4243-611b-433d-a766-9b485d51652d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 21 18:10:04 compute-0 nova_compute[183278]: 2026-01-21 18:10:04.111 183284 DEBUG nova.scheduler.client.report [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Updating ProviderTree inventory for provider 502e4243-611b-433d-a766-9b485d51652d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 21 18:10:04 compute-0 nova_compute[183278]: 2026-01-21 18:10:04.112 183284 DEBUG nova.compute.provider_tree [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Updating inventory in ProviderTree for provider 502e4243-611b-433d-a766-9b485d51652d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 21 18:10:04 compute-0 nova_compute[183278]: 2026-01-21 18:10:04.129 183284 DEBUG nova.scheduler.client.report [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Refreshing aggregate associations for resource provider 502e4243-611b-433d-a766-9b485d51652d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 21 18:10:04 compute-0 nova_compute[183278]: 2026-01-21 18:10:04.151 183284 DEBUG nova.scheduler.client.report [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Refreshing trait associations for resource provider 502e4243-611b-433d-a766-9b485d51652d, traits: COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_RESCUE_BFV,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_MMX,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NODE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_IDE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 21 18:10:04 compute-0 nova_compute[183278]: 2026-01-21 18:10:04.170 183284 DEBUG nova.compute.provider_tree [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Inventory has not changed in ProviderTree for provider: 502e4243-611b-433d-a766-9b485d51652d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 18:10:04 compute-0 nova_compute[183278]: 2026-01-21 18:10:04.265 183284 DEBUG nova.scheduler.client.report [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Inventory has not changed for provider 502e4243-611b-433d-a766-9b485d51652d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 18:10:04 compute-0 nova_compute[183278]: 2026-01-21 18:10:04.266 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 18:10:04 compute-0 nova_compute[183278]: 2026-01-21 18:10:04.266 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.484s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:10:04 compute-0 nova_compute[183278]: 2026-01-21 18:10:04.640 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:10:04 compute-0 nova_compute[183278]: 2026-01-21 18:10:04.640 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:10:04 compute-0 nova_compute[183278]: 2026-01-21 18:10:04.641 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 18:10:04 compute-0 nova_compute[183278]: 2026-01-21 18:10:04.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:10:05 compute-0 nova_compute[183278]: 2026-01-21 18:10:05.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:10:05 compute-0 podman[203319]: 2026-01-21 18:10:05.986224782 +0000 UTC m=+0.044102408 container health_status 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 21 18:10:06 compute-0 nova_compute[183278]: 2026-01-21 18:10:06.812 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:10:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:10:20.064 104698 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:10:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:10:20.065 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:10:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:10:20.065 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:10:20 compute-0 sshd-session[203343]: Invalid user ubuntu from 39.191.29.114 port 50422
Jan 21 18:10:21 compute-0 sshd-session[203343]: Received disconnect from 39.191.29.114 port 50422:11:  [preauth]
Jan 21 18:10:21 compute-0 sshd-session[203343]: Disconnected from invalid user ubuntu 39.191.29.114 port 50422 [preauth]
Jan 21 18:10:26 compute-0 podman[203345]: 2026-01-21 18:10:26.006285003 +0000 UTC m=+0.062925155 container health_status 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, release=1755695350, maintainer=Red Hat, Inc., version=9.6, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, vcs-type=git, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 21 18:10:29 compute-0 podman[192560]: time="2026-01-21T18:10:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 18:10:29 compute-0 podman[192560]: @ - - [21/Jan/2026:18:10:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15354 "" "Go-http-client/1.1"
Jan 21 18:10:29 compute-0 podman[192560]: @ - - [21/Jan/2026:18:10:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2146 "" "Go-http-client/1.1"
Jan 21 18:10:31 compute-0 openstack_network_exporter[195402]: ERROR   18:10:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 21 18:10:31 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:10:31 compute-0 openstack_network_exporter[195402]: ERROR   18:10:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 21 18:10:31 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:10:33 compute-0 podman[203368]: 2026-01-21 18:10:33.994086747 +0000 UTC m=+0.050958054 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 21 18:10:34 compute-0 podman[203367]: 2026-01-21 18:10:34.034373242 +0000 UTC m=+0.094262333 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, 
tcib_managed=true, org.label-schema.build-date=20251202)
Jan 21 18:10:36 compute-0 podman[203413]: 2026-01-21 18:10:36.9912612 +0000 UTC m=+0.050245418 container health_status 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 21 18:10:56 compute-0 podman[203437]: 2026-01-21 18:10:56.995161116 +0000 UTC m=+0.054933382 container health_status 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., release=1755695350, architecture=x86_64, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git)
Jan 21 18:10:59 compute-0 podman[192560]: time="2026-01-21T18:10:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 18:10:59 compute-0 podman[192560]: @ - - [21/Jan/2026:18:10:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15354 "" "Go-http-client/1.1"
Jan 21 18:10:59 compute-0 podman[192560]: @ - - [21/Jan/2026:18:10:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2156 "" "Go-http-client/1.1"
Jan 21 18:11:01 compute-0 openstack_network_exporter[195402]: ERROR   18:11:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 21 18:11:01 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:11:01 compute-0 openstack_network_exporter[195402]: ERROR   18:11:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 21 18:11:01 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:11:01 compute-0 nova_compute[183278]: 2026-01-21 18:11:01.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:11:02 compute-0 nova_compute[183278]: 2026-01-21 18:11:02.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:11:03 compute-0 nova_compute[183278]: 2026-01-21 18:11:03.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:11:03 compute-0 nova_compute[183278]: 2026-01-21 18:11:03.816 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 18:11:03 compute-0 nova_compute[183278]: 2026-01-21 18:11:03.817 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 18:11:03 compute-0 nova_compute[183278]: 2026-01-21 18:11:03.984 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 21 18:11:03 compute-0 nova_compute[183278]: 2026-01-21 18:11:03.984 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:11:04 compute-0 nova_compute[183278]: 2026-01-21 18:11:04.184 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:11:04 compute-0 nova_compute[183278]: 2026-01-21 18:11:04.185 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:11:04 compute-0 nova_compute[183278]: 2026-01-21 18:11:04.185 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:11:04 compute-0 nova_compute[183278]: 2026-01-21 18:11:04.185 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 18:11:04 compute-0 nova_compute[183278]: 2026-01-21 18:11:04.343 183284 WARNING nova.virt.libvirt.driver [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 18:11:04 compute-0 nova_compute[183278]: 2026-01-21 18:11:04.345 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6196MB free_disk=73.41793823242188GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 18:11:04 compute-0 nova_compute[183278]: 2026-01-21 18:11:04.345 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:11:04 compute-0 nova_compute[183278]: 2026-01-21 18:11:04.346 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:11:04 compute-0 nova_compute[183278]: 2026-01-21 18:11:04.405 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 18:11:04 compute-0 nova_compute[183278]: 2026-01-21 18:11:04.406 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 18:11:04 compute-0 nova_compute[183278]: 2026-01-21 18:11:04.431 183284 DEBUG nova.compute.provider_tree [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Inventory has not changed in ProviderTree for provider: 502e4243-611b-433d-a766-9b485d51652d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 18:11:04 compute-0 nova_compute[183278]: 2026-01-21 18:11:04.448 183284 DEBUG nova.scheduler.client.report [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Inventory has not changed for provider 502e4243-611b-433d-a766-9b485d51652d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 18:11:04 compute-0 nova_compute[183278]: 2026-01-21 18:11:04.450 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 18:11:04 compute-0 nova_compute[183278]: 2026-01-21 18:11:04.450 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.105s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:11:05 compute-0 podman[203460]: 2026-01-21 18:11:05.022517956 +0000 UTC m=+0.074047234 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 18:11:05 compute-0 podman[203459]: 2026-01-21 18:11:05.039420055 +0000 UTC m=+0.093402013 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, 
container_name=ovn_controller, org.label-schema.schema-version=1.0)
Jan 21 18:11:05 compute-0 nova_compute[183278]: 2026-01-21 18:11:05.446 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:11:05 compute-0 nova_compute[183278]: 2026-01-21 18:11:05.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:11:05 compute-0 nova_compute[183278]: 2026-01-21 18:11:05.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:11:05 compute-0 nova_compute[183278]: 2026-01-21 18:11:05.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:11:05 compute-0 nova_compute[183278]: 2026-01-21 18:11:05.817 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 18:11:06 compute-0 nova_compute[183278]: 2026-01-21 18:11:06.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:11:07 compute-0 podman[203504]: 2026-01-21 18:11:07.981252587 +0000 UTC m=+0.042274464 container health_status 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 21 18:11:08 compute-0 nova_compute[183278]: 2026-01-21 18:11:08.811 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:11:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:11:20.065 104698 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:11:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:11:20.065 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:11:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:11:20.065 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:11:20 compute-0 sshd-session[203529]: Invalid user abgar from 64.227.98.100 port 40348
Jan 21 18:11:20 compute-0 sshd-session[203529]: Connection closed by invalid user abgar 64.227.98.100 port 40348 [preauth]
Jan 21 18:11:27 compute-0 podman[203531]: 2026-01-21 18:11:27.991387216 +0000 UTC m=+0.049153122 container health_status 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, version=9.6, release=1755695350, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 
'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9)
Jan 21 18:11:29 compute-0 podman[192560]: time="2026-01-21T18:11:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 18:11:29 compute-0 podman[192560]: @ - - [21/Jan/2026:18:11:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15354 "" "Go-http-client/1.1"
Jan 21 18:11:29 compute-0 podman[192560]: @ - - [21/Jan/2026:18:11:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2154 "" "Go-http-client/1.1"
Jan 21 18:11:31 compute-0 openstack_network_exporter[195402]: ERROR   18:11:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 21 18:11:31 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:11:31 compute-0 openstack_network_exporter[195402]: ERROR   18:11:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 21 18:11:31 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:11:36 compute-0 podman[203553]: 2026-01-21 18:11:36.000717222 +0000 UTC m=+0.052939903 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Jan 21 18:11:36 compute-0 podman[203552]: 2026-01-21 18:11:36.048440797 +0000 UTC m=+0.108379205 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 18:11:38 compute-0 podman[203594]: 2026-01-21 18:11:38.994539683 +0000 UTC m=+0.049096699 container health_status 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 18:11:54 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:11:54.549 104698 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e2:aa:66', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f6:39:87:73:c8'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 18:11:54 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:11:54.549 104698 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 21 18:11:55 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:11:55.552 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a4db021d-a451-4e5f-8011-49af760bda68, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:11:58 compute-0 podman[203618]: 2026-01-21 18:11:58.987189979 +0000 UTC m=+0.044841024 container health_status 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, release=1755695350, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., config_id=openstack_network_exporter, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, vcs-type=git, distribution-scope=public, container_name=openstack_network_exporter, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc.)
Jan 21 18:11:59 compute-0 podman[192560]: time="2026-01-21T18:11:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 18:11:59 compute-0 podman[192560]: @ - - [21/Jan/2026:18:11:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15354 "" "Go-http-client/1.1"
Jan 21 18:11:59 compute-0 podman[192560]: @ - - [21/Jan/2026:18:11:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2149 "" "Go-http-client/1.1"
Jan 21 18:12:01 compute-0 openstack_network_exporter[195402]: ERROR   18:12:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 21 18:12:01 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:12:01 compute-0 openstack_network_exporter[195402]: ERROR   18:12:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 21 18:12:01 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:12:03 compute-0 nova_compute[183278]: 2026-01-21 18:12:03.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:12:03 compute-0 nova_compute[183278]: 2026-01-21 18:12:03.817 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 18:12:03 compute-0 nova_compute[183278]: 2026-01-21 18:12:03.817 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 18:12:03 compute-0 nova_compute[183278]: 2026-01-21 18:12:03.873 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 21 18:12:03 compute-0 nova_compute[183278]: 2026-01-21 18:12:03.874 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:12:03 compute-0 nova_compute[183278]: 2026-01-21 18:12:03.874 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:12:03 compute-0 nova_compute[183278]: 2026-01-21 18:12:03.919 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:12:03 compute-0 nova_compute[183278]: 2026-01-21 18:12:03.919 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:12:03 compute-0 nova_compute[183278]: 2026-01-21 18:12:03.919 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:12:03 compute-0 nova_compute[183278]: 2026-01-21 18:12:03.919 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 18:12:04 compute-0 nova_compute[183278]: 2026-01-21 18:12:04.049 183284 WARNING nova.virt.libvirt.driver [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 18:12:04 compute-0 nova_compute[183278]: 2026-01-21 18:12:04.050 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6188MB free_disk=73.41793823242188GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 18:12:04 compute-0 nova_compute[183278]: 2026-01-21 18:12:04.050 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:12:04 compute-0 nova_compute[183278]: 2026-01-21 18:12:04.050 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:12:04 compute-0 nova_compute[183278]: 2026-01-21 18:12:04.102 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 18:12:04 compute-0 nova_compute[183278]: 2026-01-21 18:12:04.102 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 18:12:04 compute-0 nova_compute[183278]: 2026-01-21 18:12:04.125 183284 DEBUG nova.compute.provider_tree [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Inventory has not changed in ProviderTree for provider: 502e4243-611b-433d-a766-9b485d51652d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 18:12:04 compute-0 nova_compute[183278]: 2026-01-21 18:12:04.140 183284 DEBUG nova.scheduler.client.report [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Inventory has not changed for provider 502e4243-611b-433d-a766-9b485d51652d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 18:12:04 compute-0 nova_compute[183278]: 2026-01-21 18:12:04.141 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 18:12:04 compute-0 nova_compute[183278]: 2026-01-21 18:12:04.142 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.091s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:12:05 compute-0 nova_compute[183278]: 2026-01-21 18:12:05.084 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:12:06 compute-0 nova_compute[183278]: 2026-01-21 18:12:06.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:12:06 compute-0 nova_compute[183278]: 2026-01-21 18:12:06.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:12:06 compute-0 nova_compute[183278]: 2026-01-21 18:12:06.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:12:06 compute-0 nova_compute[183278]: 2026-01-21 18:12:06.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:12:06 compute-0 nova_compute[183278]: 2026-01-21 18:12:06.817 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 18:12:07 compute-0 podman[203640]: 2026-01-21 18:12:07.029593035 +0000 UTC m=+0.079632075 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 21 18:12:07 compute-0 podman[203639]: 2026-01-21 18:12:07.046585345 +0000 UTC m=+0.099046683 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 18:12:07 compute-0 rsyslogd[1002]: imjournal: 249 messages lost due to rate-limiting (20000 allowed within 600 seconds)
Jan 21 18:12:09 compute-0 nova_compute[183278]: 2026-01-21 18:12:09.812 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:12:09 compute-0 podman[203683]: 2026-01-21 18:12:09.982230217 +0000 UTC m=+0.042385974 container health_status 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 18:12:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:12:20.067 104698 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:12:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:12:20.068 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:12:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:12:20.068 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:12:29 compute-0 podman[192560]: time="2026-01-21T18:12:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 18:12:29 compute-0 podman[192560]: @ - - [21/Jan/2026:18:12:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15354 "" "Go-http-client/1.1"
Jan 21 18:12:29 compute-0 podman[192560]: @ - - [21/Jan/2026:18:12:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2156 "" "Go-http-client/1.1"
Jan 21 18:12:30 compute-0 podman[203707]: 2026-01-21 18:12:30.027819517 +0000 UTC m=+0.084790090 container health_status 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, maintainer=Red Hat, Inc., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, architecture=x86_64, container_name=openstack_network_exporter, distribution-scope=public, managed_by=edpm_ansible, release=1755695350, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7)
Jan 21 18:12:31 compute-0 openstack_network_exporter[195402]: ERROR   18:12:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 21 18:12:31 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:12:31 compute-0 openstack_network_exporter[195402]: ERROR   18:12:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 21 18:12:31 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:12:34 compute-0 nova_compute[183278]: 2026-01-21 18:12:34.246 183284 DEBUG oslo_concurrency.lockutils [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Acquiring lock "841e0bef-3987-412a-805b-b71e87fa2a74" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:12:34 compute-0 nova_compute[183278]: 2026-01-21 18:12:34.246 183284 DEBUG oslo_concurrency.lockutils [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Lock "841e0bef-3987-412a-805b-b71e87fa2a74" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:12:34 compute-0 nova_compute[183278]: 2026-01-21 18:12:34.268 183284 DEBUG nova.compute.manager [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 21 18:12:34 compute-0 nova_compute[183278]: 2026-01-21 18:12:34.365 183284 DEBUG oslo_concurrency.lockutils [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:12:34 compute-0 nova_compute[183278]: 2026-01-21 18:12:34.366 183284 DEBUG oslo_concurrency.lockutils [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:12:34 compute-0 nova_compute[183278]: 2026-01-21 18:12:34.372 183284 DEBUG nova.virt.hardware [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 18:12:34 compute-0 nova_compute[183278]: 2026-01-21 18:12:34.373 183284 INFO nova.compute.claims [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] Claim successful on node compute-0.ctlplane.example.com
Jan 21 18:12:34 compute-0 nova_compute[183278]: 2026-01-21 18:12:34.475 183284 DEBUG nova.compute.provider_tree [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Inventory has not changed in ProviderTree for provider: 502e4243-611b-433d-a766-9b485d51652d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 18:12:34 compute-0 nova_compute[183278]: 2026-01-21 18:12:34.489 183284 DEBUG nova.scheduler.client.report [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Inventory has not changed for provider 502e4243-611b-433d-a766-9b485d51652d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 18:12:34 compute-0 nova_compute[183278]: 2026-01-21 18:12:34.510 183284 DEBUG oslo_concurrency.lockutils [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.144s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:12:34 compute-0 nova_compute[183278]: 2026-01-21 18:12:34.511 183284 DEBUG nova.compute.manager [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 21 18:12:34 compute-0 nova_compute[183278]: 2026-01-21 18:12:34.551 183284 DEBUG nova.compute.manager [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 21 18:12:34 compute-0 nova_compute[183278]: 2026-01-21 18:12:34.552 183284 DEBUG nova.network.neutron [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 21 18:12:34 compute-0 nova_compute[183278]: 2026-01-21 18:12:34.577 183284 INFO nova.virt.libvirt.driver [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 21 18:12:34 compute-0 nova_compute[183278]: 2026-01-21 18:12:34.595 183284 DEBUG nova.compute.manager [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 21 18:12:34 compute-0 nova_compute[183278]: 2026-01-21 18:12:34.690 183284 DEBUG nova.compute.manager [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 21 18:12:34 compute-0 nova_compute[183278]: 2026-01-21 18:12:34.691 183284 DEBUG nova.virt.libvirt.driver [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 18:12:34 compute-0 nova_compute[183278]: 2026-01-21 18:12:34.692 183284 INFO nova.virt.libvirt.driver [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] Creating image(s)
Jan 21 18:12:34 compute-0 nova_compute[183278]: 2026-01-21 18:12:34.693 183284 DEBUG oslo_concurrency.lockutils [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Acquiring lock "/var/lib/nova/instances/841e0bef-3987-412a-805b-b71e87fa2a74/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:12:34 compute-0 nova_compute[183278]: 2026-01-21 18:12:34.693 183284 DEBUG oslo_concurrency.lockutils [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Lock "/var/lib/nova/instances/841e0bef-3987-412a-805b-b71e87fa2a74/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:12:34 compute-0 nova_compute[183278]: 2026-01-21 18:12:34.694 183284 DEBUG oslo_concurrency.lockutils [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Lock "/var/lib/nova/instances/841e0bef-3987-412a-805b-b71e87fa2a74/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:12:34 compute-0 nova_compute[183278]: 2026-01-21 18:12:34.695 183284 DEBUG oslo_concurrency.lockutils [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Acquiring lock "8bf45782a806e8f2f684ae874be9ab99d891a685" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:12:34 compute-0 nova_compute[183278]: 2026-01-21 18:12:34.695 183284 DEBUG oslo_concurrency.lockutils [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Lock "8bf45782a806e8f2f684ae874be9ab99d891a685" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:12:35 compute-0 nova_compute[183278]: 2026-01-21 18:12:35.455 183284 WARNING oslo_policy.policy [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Jan 21 18:12:35 compute-0 nova_compute[183278]: 2026-01-21 18:12:35.456 183284 WARNING oslo_policy.policy [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Jan 21 18:12:35 compute-0 nova_compute[183278]: 2026-01-21 18:12:35.458 183284 DEBUG nova.policy [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '16f8ab2ae83b48f9a88753a5deddcc19', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2a4b7cdf556d4f8393d1c61b57628813', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 21 18:12:36 compute-0 nova_compute[183278]: 2026-01-21 18:12:36.131 183284 DEBUG oslo_concurrency.processutils [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:12:36 compute-0 nova_compute[183278]: 2026-01-21 18:12:36.184 183284 DEBUG oslo_concurrency.processutils [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685.part --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:12:36 compute-0 nova_compute[183278]: 2026-01-21 18:12:36.185 183284 DEBUG nova.virt.images [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] 672306ae-5521-4fc1-a825-a16d6d125c61 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Jan 21 18:12:36 compute-0 nova_compute[183278]: 2026-01-21 18:12:36.186 183284 DEBUG nova.privsep.utils [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 21 18:12:36 compute-0 nova_compute[183278]: 2026-01-21 18:12:36.187 183284 DEBUG oslo_concurrency.processutils [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685.part /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:12:36 compute-0 nova_compute[183278]: 2026-01-21 18:12:36.382 183284 DEBUG oslo_concurrency.processutils [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685.part /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685.converted" returned: 0 in 0.195s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:12:36 compute-0 nova_compute[183278]: 2026-01-21 18:12:36.386 183284 DEBUG oslo_concurrency.processutils [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:12:36 compute-0 nova_compute[183278]: 2026-01-21 18:12:36.468 183284 DEBUG oslo_concurrency.processutils [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685.converted --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:12:36 compute-0 nova_compute[183278]: 2026-01-21 18:12:36.469 183284 DEBUG oslo_concurrency.lockutils [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Lock "8bf45782a806e8f2f684ae874be9ab99d891a685" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.774s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:12:36 compute-0 nova_compute[183278]: 2026-01-21 18:12:36.480 183284 INFO oslo.privsep.daemon [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmpfx91jx52/privsep.sock']
Jan 21 18:12:37 compute-0 nova_compute[183278]: 2026-01-21 18:12:37.146 183284 INFO oslo.privsep.daemon [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Spawned new privsep daemon via rootwrap
Jan 21 18:12:37 compute-0 nova_compute[183278]: 2026-01-21 18:12:37.031 203750 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 21 18:12:37 compute-0 nova_compute[183278]: 2026-01-21 18:12:37.034 203750 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 21 18:12:37 compute-0 nova_compute[183278]: 2026-01-21 18:12:37.036 203750 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Jan 21 18:12:37 compute-0 nova_compute[183278]: 2026-01-21 18:12:37.036 203750 INFO oslo.privsep.daemon [-] privsep daemon running as pid 203750
Jan 21 18:12:37 compute-0 nova_compute[183278]: 2026-01-21 18:12:37.234 183284 DEBUG oslo_concurrency.processutils [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:12:37 compute-0 nova_compute[183278]: 2026-01-21 18:12:37.284 183284 DEBUG oslo_concurrency.processutils [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:12:37 compute-0 nova_compute[183278]: 2026-01-21 18:12:37.285 183284 DEBUG oslo_concurrency.lockutils [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Acquiring lock "8bf45782a806e8f2f684ae874be9ab99d891a685" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:12:37 compute-0 nova_compute[183278]: 2026-01-21 18:12:37.286 183284 DEBUG oslo_concurrency.lockutils [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Lock "8bf45782a806e8f2f684ae874be9ab99d891a685" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:12:37 compute-0 nova_compute[183278]: 2026-01-21 18:12:37.296 183284 DEBUG oslo_concurrency.processutils [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:12:37 compute-0 nova_compute[183278]: 2026-01-21 18:12:37.345 183284 DEBUG oslo_concurrency.processutils [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:12:37 compute-0 nova_compute[183278]: 2026-01-21 18:12:37.346 183284 DEBUG oslo_concurrency.processutils [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685,backing_fmt=raw /var/lib/nova/instances/841e0bef-3987-412a-805b-b71e87fa2a74/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:12:37 compute-0 nova_compute[183278]: 2026-01-21 18:12:37.376 183284 DEBUG oslo_concurrency.processutils [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685,backing_fmt=raw /var/lib/nova/instances/841e0bef-3987-412a-805b-b71e87fa2a74/disk 1073741824" returned: 0 in 0.029s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:12:37 compute-0 nova_compute[183278]: 2026-01-21 18:12:37.378 183284 DEBUG oslo_concurrency.lockutils [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Lock "8bf45782a806e8f2f684ae874be9ab99d891a685" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.093s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:12:37 compute-0 nova_compute[183278]: 2026-01-21 18:12:37.379 183284 DEBUG oslo_concurrency.processutils [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:12:37 compute-0 nova_compute[183278]: 2026-01-21 18:12:37.428 183284 DEBUG oslo_concurrency.processutils [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:12:37 compute-0 nova_compute[183278]: 2026-01-21 18:12:37.429 183284 DEBUG nova.virt.disk.api [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Checking if we can resize image /var/lib/nova/instances/841e0bef-3987-412a-805b-b71e87fa2a74/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 18:12:37 compute-0 nova_compute[183278]: 2026-01-21 18:12:37.429 183284 DEBUG oslo_concurrency.processutils [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/841e0bef-3987-412a-805b-b71e87fa2a74/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:12:37 compute-0 nova_compute[183278]: 2026-01-21 18:12:37.477 183284 DEBUG oslo_concurrency.processutils [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/841e0bef-3987-412a-805b-b71e87fa2a74/disk --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:12:37 compute-0 nova_compute[183278]: 2026-01-21 18:12:37.479 183284 DEBUG nova.virt.disk.api [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Cannot resize image /var/lib/nova/instances/841e0bef-3987-412a-805b-b71e87fa2a74/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 18:12:37 compute-0 nova_compute[183278]: 2026-01-21 18:12:37.479 183284 DEBUG nova.objects.instance [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Lazy-loading 'migration_context' on Instance uuid 841e0bef-3987-412a-805b-b71e87fa2a74 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:12:37 compute-0 nova_compute[183278]: 2026-01-21 18:12:37.544 183284 DEBUG nova.virt.libvirt.driver [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 18:12:37 compute-0 nova_compute[183278]: 2026-01-21 18:12:37.545 183284 DEBUG nova.virt.libvirt.driver [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] Ensure instance console log exists: /var/lib/nova/instances/841e0bef-3987-412a-805b-b71e87fa2a74/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 18:12:37 compute-0 nova_compute[183278]: 2026-01-21 18:12:37.545 183284 DEBUG oslo_concurrency.lockutils [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:12:37 compute-0 nova_compute[183278]: 2026-01-21 18:12:37.546 183284 DEBUG oslo_concurrency.lockutils [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:12:37 compute-0 nova_compute[183278]: 2026-01-21 18:12:37.546 183284 DEBUG oslo_concurrency.lockutils [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:12:37 compute-0 nova_compute[183278]: 2026-01-21 18:12:37.757 183284 DEBUG nova.network.neutron [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] Successfully created port: fa0544b4-ca76-47d6-a911-a353d9e6095f _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 21 18:12:37 compute-0 podman[203768]: 2026-01-21 18:12:37.992429852 +0000 UTC m=+0.050318136 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 21 18:12:38 compute-0 podman[203767]: 2026-01-21 18:12:38.036267172 +0000 UTC m=+0.088438438 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 21 18:12:39 compute-0 nova_compute[183278]: 2026-01-21 18:12:39.440 183284 DEBUG nova.network.neutron [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] Successfully updated port: fa0544b4-ca76-47d6-a911-a353d9e6095f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 21 18:12:39 compute-0 nova_compute[183278]: 2026-01-21 18:12:39.462 183284 DEBUG oslo_concurrency.lockutils [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Acquiring lock "refresh_cache-841e0bef-3987-412a-805b-b71e87fa2a74" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:12:39 compute-0 nova_compute[183278]: 2026-01-21 18:12:39.462 183284 DEBUG oslo_concurrency.lockutils [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Acquired lock "refresh_cache-841e0bef-3987-412a-805b-b71e87fa2a74" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 18:12:39 compute-0 nova_compute[183278]: 2026-01-21 18:12:39.462 183284 DEBUG nova.network.neutron [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 18:12:39 compute-0 nova_compute[183278]: 2026-01-21 18:12:39.638 183284 DEBUG nova.network.neutron [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 18:12:39 compute-0 nova_compute[183278]: 2026-01-21 18:12:39.911 183284 DEBUG nova.compute.manager [req-f950f5e4-e593-4b72-874d-bb0a2b9ea8be req-afc6549f-3b7c-4731-874a-f99fe3cc36ab 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] Received event network-changed-fa0544b4-ca76-47d6-a911-a353d9e6095f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:12:39 compute-0 nova_compute[183278]: 2026-01-21 18:12:39.912 183284 DEBUG nova.compute.manager [req-f950f5e4-e593-4b72-874d-bb0a2b9ea8be req-afc6549f-3b7c-4731-874a-f99fe3cc36ab 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] Refreshing instance network info cache due to event network-changed-fa0544b4-ca76-47d6-a911-a353d9e6095f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 18:12:39 compute-0 nova_compute[183278]: 2026-01-21 18:12:39.912 183284 DEBUG oslo_concurrency.lockutils [req-f950f5e4-e593-4b72-874d-bb0a2b9ea8be req-afc6549f-3b7c-4731-874a-f99fe3cc36ab 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "refresh_cache-841e0bef-3987-412a-805b-b71e87fa2a74" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:12:40 compute-0 podman[203812]: 2026-01-21 18:12:40.989589531 +0000 UTC m=+0.051094816 container health_status 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 21 18:12:41 compute-0 nova_compute[183278]: 2026-01-21 18:12:41.644 183284 DEBUG nova.network.neutron [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] Updating instance_info_cache with network_info: [{"id": "fa0544b4-ca76-47d6-a911-a353d9e6095f", "address": "fa:16:3e:ac:65:b9", "network": {"id": "e8161199-7513-4099-89c4-00e7e075c92b", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-514974975-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2a4b7cdf556d4f8393d1c61b57628813", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa0544b4-ca", "ovs_interfaceid": "fa0544b4-ca76-47d6-a911-a353d9e6095f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:12:41 compute-0 nova_compute[183278]: 2026-01-21 18:12:41.759 183284 DEBUG oslo_concurrency.lockutils [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Releasing lock "refresh_cache-841e0bef-3987-412a-805b-b71e87fa2a74" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 18:12:41 compute-0 nova_compute[183278]: 2026-01-21 18:12:41.759 183284 DEBUG nova.compute.manager [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] Instance network_info: |[{"id": "fa0544b4-ca76-47d6-a911-a353d9e6095f", "address": "fa:16:3e:ac:65:b9", "network": {"id": "e8161199-7513-4099-89c4-00e7e075c92b", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-514974975-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2a4b7cdf556d4f8393d1c61b57628813", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa0544b4-ca", "ovs_interfaceid": "fa0544b4-ca76-47d6-a911-a353d9e6095f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 21 18:12:41 compute-0 nova_compute[183278]: 2026-01-21 18:12:41.760 183284 DEBUG oslo_concurrency.lockutils [req-f950f5e4-e593-4b72-874d-bb0a2b9ea8be req-afc6549f-3b7c-4731-874a-f99fe3cc36ab 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquired lock "refresh_cache-841e0bef-3987-412a-805b-b71e87fa2a74" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 18:12:41 compute-0 nova_compute[183278]: 2026-01-21 18:12:41.760 183284 DEBUG nova.network.neutron [req-f950f5e4-e593-4b72-874d-bb0a2b9ea8be req-afc6549f-3b7c-4731-874a-f99fe3cc36ab 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] Refreshing network info cache for port fa0544b4-ca76-47d6-a911-a353d9e6095f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 18:12:41 compute-0 nova_compute[183278]: 2026-01-21 18:12:41.764 183284 DEBUG nova.virt.libvirt.driver [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] Start _get_guest_xml network_info=[{"id": "fa0544b4-ca76-47d6-a911-a353d9e6095f", "address": "fa:16:3e:ac:65:b9", "network": {"id": "e8161199-7513-4099-89c4-00e7e075c92b", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-514974975-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2a4b7cdf556d4f8393d1c61b57628813", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa0544b4-ca", "ovs_interfaceid": "fa0544b4-ca76-47d6-a911-a353d9e6095f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T18:09:31Z,direct_url=<?>,disk_format='qcow2',id=672306ae-5521-4fc1-a825-a16d6d125c61,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='05bd8d3790e24414a90ecf55ee1939e1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T18:09:33Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_type': 'disk', 'encryption_options': None, 'boot_index': 0, 'encryption_secret_uuid': None, 'encryption_format': None, 'image_id': '672306ae-5521-4fc1-a825-a16d6d125c61'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 18:12:41 compute-0 nova_compute[183278]: 2026-01-21 18:12:41.768 183284 WARNING nova.virt.libvirt.driver [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 18:12:41 compute-0 nova_compute[183278]: 2026-01-21 18:12:41.773 183284 DEBUG nova.virt.libvirt.host [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 18:12:41 compute-0 nova_compute[183278]: 2026-01-21 18:12:41.774 183284 DEBUG nova.virt.libvirt.host [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 18:12:41 compute-0 nova_compute[183278]: 2026-01-21 18:12:41.776 183284 DEBUG nova.virt.libvirt.host [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 18:12:41 compute-0 nova_compute[183278]: 2026-01-21 18:12:41.777 183284 DEBUG nova.virt.libvirt.host [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 18:12:41 compute-0 nova_compute[183278]: 2026-01-21 18:12:41.778 183284 DEBUG nova.virt.libvirt.driver [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 18:12:41 compute-0 nova_compute[183278]: 2026-01-21 18:12:41.778 183284 DEBUG nova.virt.hardware [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T18:09:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='45095fe9-3fd5-4f1f-87b2-a2a8292135a2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T18:09:31Z,direct_url=<?>,disk_format='qcow2',id=672306ae-5521-4fc1-a825-a16d6d125c61,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='05bd8d3790e24414a90ecf55ee1939e1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T18:09:33Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 18:12:41 compute-0 nova_compute[183278]: 2026-01-21 18:12:41.779 183284 DEBUG nova.virt.hardware [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 18:12:41 compute-0 nova_compute[183278]: 2026-01-21 18:12:41.779 183284 DEBUG nova.virt.hardware [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 18:12:41 compute-0 nova_compute[183278]: 2026-01-21 18:12:41.779 183284 DEBUG nova.virt.hardware [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 18:12:41 compute-0 nova_compute[183278]: 2026-01-21 18:12:41.779 183284 DEBUG nova.virt.hardware [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 18:12:41 compute-0 nova_compute[183278]: 2026-01-21 18:12:41.780 183284 DEBUG nova.virt.hardware [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 18:12:41 compute-0 nova_compute[183278]: 2026-01-21 18:12:41.780 183284 DEBUG nova.virt.hardware [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 18:12:41 compute-0 nova_compute[183278]: 2026-01-21 18:12:41.780 183284 DEBUG nova.virt.hardware [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 18:12:41 compute-0 nova_compute[183278]: 2026-01-21 18:12:41.780 183284 DEBUG nova.virt.hardware [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 18:12:41 compute-0 nova_compute[183278]: 2026-01-21 18:12:41.781 183284 DEBUG nova.virt.hardware [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 18:12:41 compute-0 nova_compute[183278]: 2026-01-21 18:12:41.781 183284 DEBUG nova.virt.hardware [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 18:12:41 compute-0 nova_compute[183278]: 2026-01-21 18:12:41.784 183284 DEBUG nova.privsep.utils [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 21 18:12:41 compute-0 nova_compute[183278]: 2026-01-21 18:12:41.785 183284 DEBUG nova.virt.libvirt.vif [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T18:12:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-215034302',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-215034302',id=1,image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2a4b7cdf556d4f8393d1c61b57628813',ramdisk_id='',reservation_id='r-gzrkbma3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-627352265',owner_user_name='tempest-TestExecuteActionsViaActuator-627352265-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T18:12:34Z,user_data=None,user_id='16f8ab2ae83b48f9a88753a5deddcc19',uuid=841e0bef-3987-412a-805b-b71e87fa2a74,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fa0544b4-ca76-47d6-a911-a353d9e6095f", "address": "fa:16:3e:ac:65:b9", "network": {"id": "e8161199-7513-4099-89c4-00e7e075c92b", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-514974975-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2a4b7cdf556d4f8393d1c61b57628813", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa0544b4-ca", "ovs_interfaceid": "fa0544b4-ca76-47d6-a911-a353d9e6095f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 18:12:41 compute-0 nova_compute[183278]: 2026-01-21 18:12:41.785 183284 DEBUG nova.network.os_vif_util [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Converting VIF {"id": "fa0544b4-ca76-47d6-a911-a353d9e6095f", "address": "fa:16:3e:ac:65:b9", "network": {"id": "e8161199-7513-4099-89c4-00e7e075c92b", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-514974975-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2a4b7cdf556d4f8393d1c61b57628813", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa0544b4-ca", "ovs_interfaceid": "fa0544b4-ca76-47d6-a911-a353d9e6095f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 18:12:41 compute-0 nova_compute[183278]: 2026-01-21 18:12:41.786 183284 DEBUG nova.network.os_vif_util [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ac:65:b9,bridge_name='br-int',has_traffic_filtering=True,id=fa0544b4-ca76-47d6-a911-a353d9e6095f,network=Network(e8161199-7513-4099-89c4-00e7e075c92b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfa0544b4-ca') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 18:12:41 compute-0 nova_compute[183278]: 2026-01-21 18:12:41.787 183284 DEBUG nova.objects.instance [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Lazy-loading 'pci_devices' on Instance uuid 841e0bef-3987-412a-805b-b71e87fa2a74 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:12:41 compute-0 nova_compute[183278]: 2026-01-21 18:12:41.968 183284 DEBUG nova.virt.libvirt.driver [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] End _get_guest_xml xml=<domain type="kvm">
Jan 21 18:12:41 compute-0 nova_compute[183278]:   <uuid>841e0bef-3987-412a-805b-b71e87fa2a74</uuid>
Jan 21 18:12:41 compute-0 nova_compute[183278]:   <name>instance-00000001</name>
Jan 21 18:12:41 compute-0 nova_compute[183278]:   <memory>131072</memory>
Jan 21 18:12:41 compute-0 nova_compute[183278]:   <vcpu>1</vcpu>
Jan 21 18:12:41 compute-0 nova_compute[183278]:   <metadata>
Jan 21 18:12:41 compute-0 nova_compute[183278]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 18:12:41 compute-0 nova_compute[183278]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 18:12:41 compute-0 nova_compute[183278]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-215034302</nova:name>
Jan 21 18:12:41 compute-0 nova_compute[183278]:       <nova:creationTime>2026-01-21 18:12:41</nova:creationTime>
Jan 21 18:12:41 compute-0 nova_compute[183278]:       <nova:flavor name="m1.nano">
Jan 21 18:12:41 compute-0 nova_compute[183278]:         <nova:memory>128</nova:memory>
Jan 21 18:12:41 compute-0 nova_compute[183278]:         <nova:disk>1</nova:disk>
Jan 21 18:12:41 compute-0 nova_compute[183278]:         <nova:swap>0</nova:swap>
Jan 21 18:12:41 compute-0 nova_compute[183278]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 18:12:41 compute-0 nova_compute[183278]:         <nova:vcpus>1</nova:vcpus>
Jan 21 18:12:41 compute-0 nova_compute[183278]:       </nova:flavor>
Jan 21 18:12:41 compute-0 nova_compute[183278]:       <nova:owner>
Jan 21 18:12:41 compute-0 nova_compute[183278]:         <nova:user uuid="16f8ab2ae83b48f9a88753a5deddcc19">tempest-TestExecuteActionsViaActuator-627352265-project-member</nova:user>
Jan 21 18:12:41 compute-0 nova_compute[183278]:         <nova:project uuid="2a4b7cdf556d4f8393d1c61b57628813">tempest-TestExecuteActionsViaActuator-627352265</nova:project>
Jan 21 18:12:41 compute-0 nova_compute[183278]:       </nova:owner>
Jan 21 18:12:41 compute-0 nova_compute[183278]:       <nova:root type="image" uuid="672306ae-5521-4fc1-a825-a16d6d125c61"/>
Jan 21 18:12:41 compute-0 nova_compute[183278]:       <nova:ports>
Jan 21 18:12:41 compute-0 nova_compute[183278]:         <nova:port uuid="fa0544b4-ca76-47d6-a911-a353d9e6095f">
Jan 21 18:12:41 compute-0 nova_compute[183278]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 21 18:12:41 compute-0 nova_compute[183278]:         </nova:port>
Jan 21 18:12:41 compute-0 nova_compute[183278]:       </nova:ports>
Jan 21 18:12:41 compute-0 nova_compute[183278]:     </nova:instance>
Jan 21 18:12:41 compute-0 nova_compute[183278]:   </metadata>
Jan 21 18:12:41 compute-0 nova_compute[183278]:   <sysinfo type="smbios">
Jan 21 18:12:41 compute-0 nova_compute[183278]:     <system>
Jan 21 18:12:41 compute-0 nova_compute[183278]:       <entry name="manufacturer">RDO</entry>
Jan 21 18:12:41 compute-0 nova_compute[183278]:       <entry name="product">OpenStack Compute</entry>
Jan 21 18:12:41 compute-0 nova_compute[183278]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 18:12:41 compute-0 nova_compute[183278]:       <entry name="serial">841e0bef-3987-412a-805b-b71e87fa2a74</entry>
Jan 21 18:12:41 compute-0 nova_compute[183278]:       <entry name="uuid">841e0bef-3987-412a-805b-b71e87fa2a74</entry>
Jan 21 18:12:41 compute-0 nova_compute[183278]:       <entry name="family">Virtual Machine</entry>
Jan 21 18:12:41 compute-0 nova_compute[183278]:     </system>
Jan 21 18:12:41 compute-0 nova_compute[183278]:   </sysinfo>
Jan 21 18:12:41 compute-0 nova_compute[183278]:   <os>
Jan 21 18:12:41 compute-0 nova_compute[183278]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 18:12:41 compute-0 nova_compute[183278]:     <boot dev="hd"/>
Jan 21 18:12:41 compute-0 nova_compute[183278]:     <smbios mode="sysinfo"/>
Jan 21 18:12:41 compute-0 nova_compute[183278]:   </os>
Jan 21 18:12:41 compute-0 nova_compute[183278]:   <features>
Jan 21 18:12:41 compute-0 nova_compute[183278]:     <acpi/>
Jan 21 18:12:41 compute-0 nova_compute[183278]:     <apic/>
Jan 21 18:12:41 compute-0 nova_compute[183278]:     <vmcoreinfo/>
Jan 21 18:12:41 compute-0 nova_compute[183278]:   </features>
Jan 21 18:12:41 compute-0 nova_compute[183278]:   <clock offset="utc">
Jan 21 18:12:41 compute-0 nova_compute[183278]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 18:12:41 compute-0 nova_compute[183278]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 18:12:41 compute-0 nova_compute[183278]:     <timer name="hpet" present="no"/>
Jan 21 18:12:41 compute-0 nova_compute[183278]:   </clock>
Jan 21 18:12:41 compute-0 nova_compute[183278]:   <cpu mode="custom" match="exact">
Jan 21 18:12:41 compute-0 nova_compute[183278]:     <model>Nehalem</model>
Jan 21 18:12:41 compute-0 nova_compute[183278]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 18:12:41 compute-0 nova_compute[183278]:   </cpu>
Jan 21 18:12:41 compute-0 nova_compute[183278]:   <devices>
Jan 21 18:12:41 compute-0 nova_compute[183278]:     <disk type="file" device="disk">
Jan 21 18:12:41 compute-0 nova_compute[183278]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 18:12:41 compute-0 nova_compute[183278]:       <source file="/var/lib/nova/instances/841e0bef-3987-412a-805b-b71e87fa2a74/disk"/>
Jan 21 18:12:41 compute-0 nova_compute[183278]:       <target dev="vda" bus="virtio"/>
Jan 21 18:12:41 compute-0 nova_compute[183278]:     </disk>
Jan 21 18:12:41 compute-0 nova_compute[183278]:     <disk type="file" device="cdrom">
Jan 21 18:12:41 compute-0 nova_compute[183278]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 18:12:41 compute-0 nova_compute[183278]:       <source file="/var/lib/nova/instances/841e0bef-3987-412a-805b-b71e87fa2a74/disk.config"/>
Jan 21 18:12:41 compute-0 nova_compute[183278]:       <target dev="sda" bus="sata"/>
Jan 21 18:12:41 compute-0 nova_compute[183278]:     </disk>
Jan 21 18:12:41 compute-0 nova_compute[183278]:     <interface type="ethernet">
Jan 21 18:12:41 compute-0 nova_compute[183278]:       <mac address="fa:16:3e:ac:65:b9"/>
Jan 21 18:12:41 compute-0 nova_compute[183278]:       <model type="virtio"/>
Jan 21 18:12:41 compute-0 nova_compute[183278]:       <driver name="vhost" rx_queue_size="512"/>
Jan 21 18:12:41 compute-0 nova_compute[183278]:       <mtu size="1442"/>
Jan 21 18:12:41 compute-0 nova_compute[183278]:       <target dev="tapfa0544b4-ca"/>
Jan 21 18:12:41 compute-0 nova_compute[183278]:     </interface>
Jan 21 18:12:41 compute-0 nova_compute[183278]:     <serial type="pty">
Jan 21 18:12:41 compute-0 nova_compute[183278]:       <log file="/var/lib/nova/instances/841e0bef-3987-412a-805b-b71e87fa2a74/console.log" append="off"/>
Jan 21 18:12:41 compute-0 nova_compute[183278]:     </serial>
Jan 21 18:12:41 compute-0 nova_compute[183278]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 18:12:41 compute-0 nova_compute[183278]:     <video>
Jan 21 18:12:41 compute-0 nova_compute[183278]:       <model type="virtio"/>
Jan 21 18:12:41 compute-0 nova_compute[183278]:     </video>
Jan 21 18:12:41 compute-0 nova_compute[183278]:     <input type="tablet" bus="usb"/>
Jan 21 18:12:41 compute-0 nova_compute[183278]:     <rng model="virtio">
Jan 21 18:12:41 compute-0 nova_compute[183278]:       <backend model="random">/dev/urandom</backend>
Jan 21 18:12:41 compute-0 nova_compute[183278]:     </rng>
Jan 21 18:12:41 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root"/>
Jan 21 18:12:41 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:12:41 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:12:41 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:12:41 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:12:41 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:12:41 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:12:41 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:12:41 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:12:41 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:12:41 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:12:41 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:12:41 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:12:41 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:12:41 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:12:41 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:12:41 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:12:41 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:12:41 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:12:41 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:12:41 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:12:41 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:12:41 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:12:41 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:12:41 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:12:41 compute-0 nova_compute[183278]:     <controller type="usb" index="0"/>
Jan 21 18:12:41 compute-0 nova_compute[183278]:     <memballoon model="virtio">
Jan 21 18:12:41 compute-0 nova_compute[183278]:       <stats period="10"/>
Jan 21 18:12:41 compute-0 nova_compute[183278]:     </memballoon>
Jan 21 18:12:41 compute-0 nova_compute[183278]:   </devices>
Jan 21 18:12:41 compute-0 nova_compute[183278]: </domain>
Jan 21 18:12:41 compute-0 nova_compute[183278]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 18:12:41 compute-0 nova_compute[183278]: 2026-01-21 18:12:41.970 183284 DEBUG nova.compute.manager [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] Preparing to wait for external event network-vif-plugged-fa0544b4-ca76-47d6-a911-a353d9e6095f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 21 18:12:41 compute-0 nova_compute[183278]: 2026-01-21 18:12:41.970 183284 DEBUG oslo_concurrency.lockutils [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Acquiring lock "841e0bef-3987-412a-805b-b71e87fa2a74-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:12:41 compute-0 nova_compute[183278]: 2026-01-21 18:12:41.970 183284 DEBUG oslo_concurrency.lockutils [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Lock "841e0bef-3987-412a-805b-b71e87fa2a74-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:12:41 compute-0 nova_compute[183278]: 2026-01-21 18:12:41.971 183284 DEBUG oslo_concurrency.lockutils [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Lock "841e0bef-3987-412a-805b-b71e87fa2a74-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:12:41 compute-0 nova_compute[183278]: 2026-01-21 18:12:41.971 183284 DEBUG nova.virt.libvirt.vif [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T18:12:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-215034302',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-215034302',id=1,image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2a4b7cdf556d4f8393d1c61b57628813',ramdisk_id='',reservation_id='r-gzrkbma3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-627352265',owner_user_name='tempest-TestExecuteActionsViaActuator-6273522
65-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T18:12:34Z,user_data=None,user_id='16f8ab2ae83b48f9a88753a5deddcc19',uuid=841e0bef-3987-412a-805b-b71e87fa2a74,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fa0544b4-ca76-47d6-a911-a353d9e6095f", "address": "fa:16:3e:ac:65:b9", "network": {"id": "e8161199-7513-4099-89c4-00e7e075c92b", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-514974975-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2a4b7cdf556d4f8393d1c61b57628813", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa0544b4-ca", "ovs_interfaceid": "fa0544b4-ca76-47d6-a911-a353d9e6095f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 21 18:12:41 compute-0 nova_compute[183278]: 2026-01-21 18:12:41.971 183284 DEBUG nova.network.os_vif_util [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Converting VIF {"id": "fa0544b4-ca76-47d6-a911-a353d9e6095f", "address": "fa:16:3e:ac:65:b9", "network": {"id": "e8161199-7513-4099-89c4-00e7e075c92b", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-514974975-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2a4b7cdf556d4f8393d1c61b57628813", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa0544b4-ca", "ovs_interfaceid": "fa0544b4-ca76-47d6-a911-a353d9e6095f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 18:12:41 compute-0 nova_compute[183278]: 2026-01-21 18:12:41.972 183284 DEBUG nova.network.os_vif_util [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ac:65:b9,bridge_name='br-int',has_traffic_filtering=True,id=fa0544b4-ca76-47d6-a911-a353d9e6095f,network=Network(e8161199-7513-4099-89c4-00e7e075c92b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfa0544b4-ca') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 18:12:41 compute-0 nova_compute[183278]: 2026-01-21 18:12:41.972 183284 DEBUG os_vif [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ac:65:b9,bridge_name='br-int',has_traffic_filtering=True,id=fa0544b4-ca76-47d6-a911-a353d9e6095f,network=Network(e8161199-7513-4099-89c4-00e7e075c92b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfa0544b4-ca') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 21 18:12:42 compute-0 nova_compute[183278]: 2026-01-21 18:12:42.007 183284 DEBUG ovsdbapp.backend.ovs_idl [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 21 18:12:42 compute-0 nova_compute[183278]: 2026-01-21 18:12:42.007 183284 DEBUG ovsdbapp.backend.ovs_idl [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 21 18:12:42 compute-0 nova_compute[183278]: 2026-01-21 18:12:42.007 183284 DEBUG ovsdbapp.backend.ovs_idl [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 21 18:12:42 compute-0 nova_compute[183278]: 2026-01-21 18:12:42.008 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 21 18:12:42 compute-0 nova_compute[183278]: 2026-01-21 18:12:42.009 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [POLLOUT] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:12:42 compute-0 nova_compute[183278]: 2026-01-21 18:12:42.009 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 21 18:12:42 compute-0 nova_compute[183278]: 2026-01-21 18:12:42.009 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:12:42 compute-0 nova_compute[183278]: 2026-01-21 18:12:42.011 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:12:42 compute-0 nova_compute[183278]: 2026-01-21 18:12:42.013 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:12:42 compute-0 nova_compute[183278]: 2026-01-21 18:12:42.022 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:12:42 compute-0 nova_compute[183278]: 2026-01-21 18:12:42.022 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:12:42 compute-0 nova_compute[183278]: 2026-01-21 18:12:42.022 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 18:12:42 compute-0 nova_compute[183278]: 2026-01-21 18:12:42.023 183284 INFO oslo.privsep.daemon [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpqx28m1r6/privsep.sock']
Jan 21 18:12:42 compute-0 nova_compute[183278]: 2026-01-21 18:12:42.671 183284 INFO oslo.privsep.daemon [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Spawned new privsep daemon via rootwrap
Jan 21 18:12:42 compute-0 nova_compute[183278]: 2026-01-21 18:12:42.551 203841 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 21 18:12:42 compute-0 nova_compute[183278]: 2026-01-21 18:12:42.557 203841 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 21 18:12:42 compute-0 nova_compute[183278]: 2026-01-21 18:12:42.559 203841 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Jan 21 18:12:42 compute-0 nova_compute[183278]: 2026-01-21 18:12:42.559 203841 INFO oslo.privsep.daemon [-] privsep daemon running as pid 203841
Jan 21 18:12:42 compute-0 nova_compute[183278]: 2026-01-21 18:12:42.964 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:12:42 compute-0 nova_compute[183278]: 2026-01-21 18:12:42.964 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfa0544b4-ca, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:12:42 compute-0 nova_compute[183278]: 2026-01-21 18:12:42.965 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapfa0544b4-ca, col_values=(('external_ids', {'iface-id': 'fa0544b4-ca76-47d6-a911-a353d9e6095f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ac:65:b9', 'vm-uuid': '841e0bef-3987-412a-805b-b71e87fa2a74'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:12:42 compute-0 nova_compute[183278]: 2026-01-21 18:12:42.966 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:12:42 compute-0 NetworkManager[55506]: <info>  [1769019162.9673] manager: (tapfa0544b4-ca): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/20)
Jan 21 18:12:42 compute-0 nova_compute[183278]: 2026-01-21 18:12:42.970 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 18:12:42 compute-0 nova_compute[183278]: 2026-01-21 18:12:42.972 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:12:42 compute-0 nova_compute[183278]: 2026-01-21 18:12:42.973 183284 INFO os_vif [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ac:65:b9,bridge_name='br-int',has_traffic_filtering=True,id=fa0544b4-ca76-47d6-a911-a353d9e6095f,network=Network(e8161199-7513-4099-89c4-00e7e075c92b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfa0544b4-ca')
Jan 21 18:12:43 compute-0 nova_compute[183278]: 2026-01-21 18:12:43.041 183284 DEBUG nova.virt.libvirt.driver [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 18:12:43 compute-0 nova_compute[183278]: 2026-01-21 18:12:43.042 183284 DEBUG nova.virt.libvirt.driver [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 18:12:43 compute-0 nova_compute[183278]: 2026-01-21 18:12:43.042 183284 DEBUG nova.virt.libvirt.driver [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] No VIF found with MAC fa:16:3e:ac:65:b9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 21 18:12:43 compute-0 nova_compute[183278]: 2026-01-21 18:12:43.042 183284 INFO nova.virt.libvirt.driver [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] Using config drive
Jan 21 18:12:43 compute-0 nova_compute[183278]: 2026-01-21 18:12:43.709 183284 DEBUG nova.network.neutron [req-f950f5e4-e593-4b72-874d-bb0a2b9ea8be req-afc6549f-3b7c-4731-874a-f99fe3cc36ab 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] Updated VIF entry in instance network info cache for port fa0544b4-ca76-47d6-a911-a353d9e6095f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 18:12:43 compute-0 nova_compute[183278]: 2026-01-21 18:12:43.710 183284 DEBUG nova.network.neutron [req-f950f5e4-e593-4b72-874d-bb0a2b9ea8be req-afc6549f-3b7c-4731-874a-f99fe3cc36ab 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] Updating instance_info_cache with network_info: [{"id": "fa0544b4-ca76-47d6-a911-a353d9e6095f", "address": "fa:16:3e:ac:65:b9", "network": {"id": "e8161199-7513-4099-89c4-00e7e075c92b", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-514974975-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2a4b7cdf556d4f8393d1c61b57628813", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa0544b4-ca", "ovs_interfaceid": "fa0544b4-ca76-47d6-a911-a353d9e6095f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:12:43 compute-0 nova_compute[183278]: 2026-01-21 18:12:43.725 183284 DEBUG oslo_concurrency.lockutils [req-f950f5e4-e593-4b72-874d-bb0a2b9ea8be req-afc6549f-3b7c-4731-874a-f99fe3cc36ab 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Releasing lock "refresh_cache-841e0bef-3987-412a-805b-b71e87fa2a74" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 18:12:43 compute-0 nova_compute[183278]: 2026-01-21 18:12:43.743 183284 INFO nova.virt.libvirt.driver [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] Creating config drive at /var/lib/nova/instances/841e0bef-3987-412a-805b-b71e87fa2a74/disk.config
Jan 21 18:12:43 compute-0 nova_compute[183278]: 2026-01-21 18:12:43.748 183284 DEBUG oslo_concurrency.processutils [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/841e0bef-3987-412a-805b-b71e87fa2a74/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpekmls43i execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:12:43 compute-0 nova_compute[183278]: 2026-01-21 18:12:43.870 183284 DEBUG oslo_concurrency.processutils [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/841e0bef-3987-412a-805b-b71e87fa2a74/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpekmls43i" returned: 0 in 0.122s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:12:43 compute-0 kernel: tun: Universal TUN/TAP device driver, 1.6
Jan 21 18:12:43 compute-0 kernel: tapfa0544b4-ca: entered promiscuous mode
Jan 21 18:12:43 compute-0 ovn_controller[95419]: 2026-01-21T18:12:43Z|00027|binding|INFO|Claiming lport fa0544b4-ca76-47d6-a911-a353d9e6095f for this chassis.
Jan 21 18:12:43 compute-0 ovn_controller[95419]: 2026-01-21T18:12:43Z|00028|binding|INFO|fa0544b4-ca76-47d6-a911-a353d9e6095f: Claiming fa:16:3e:ac:65:b9 10.100.0.5
Jan 21 18:12:43 compute-0 NetworkManager[55506]: <info>  [1769019163.9339] manager: (tapfa0544b4-ca): new Tun device (/org/freedesktop/NetworkManager/Devices/21)
Jan 21 18:12:43 compute-0 nova_compute[183278]: 2026-01-21 18:12:43.934 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:12:43 compute-0 nova_compute[183278]: 2026-01-21 18:12:43.936 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:12:43 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:12:43.946 104698 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ac:65:b9 10.100.0.5'], port_security=['fa:16:3e:ac:65:b9 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '841e0bef-3987-412a-805b-b71e87fa2a74', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e8161199-7513-4099-89c4-00e7e075c92b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a4b7cdf556d4f8393d1c61b57628813', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c5ea7560-106a-40fd-a00a-355d8be6545e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=afee9644-a390-49fb-b346-3fd1c948feef, chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>], logical_port=fa0544b4-ca76-47d6-a911-a353d9e6095f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 18:12:43 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:12:43.947 104698 INFO neutron.agent.ovn.metadata.agent [-] Port fa0544b4-ca76-47d6-a911-a353d9e6095f in datapath e8161199-7513-4099-89c4-00e7e075c92b bound to our chassis
Jan 21 18:12:43 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:12:43.949 104698 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e8161199-7513-4099-89c4-00e7e075c92b
Jan 21 18:12:43 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:12:43.950 104698 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpw16nnrdp/privsep.sock']
Jan 21 18:12:43 compute-0 systemd-udevd[203866]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 18:12:43 compute-0 NetworkManager[55506]: <info>  [1769019163.9670] device (tapfa0544b4-ca): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 18:12:43 compute-0 NetworkManager[55506]: <info>  [1769019163.9683] device (tapfa0544b4-ca): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 18:12:43 compute-0 systemd-machined[154592]: New machine qemu-1-instance-00000001.
Jan 21 18:12:43 compute-0 nova_compute[183278]: 2026-01-21 18:12:43.997 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:12:44 compute-0 ovn_controller[95419]: 2026-01-21T18:12:44Z|00029|binding|INFO|Setting lport fa0544b4-ca76-47d6-a911-a353d9e6095f ovn-installed in OVS
Jan 21 18:12:44 compute-0 ovn_controller[95419]: 2026-01-21T18:12:44Z|00030|binding|INFO|Setting lport fa0544b4-ca76-47d6-a911-a353d9e6095f up in Southbound
Jan 21 18:12:44 compute-0 nova_compute[183278]: 2026-01-21 18:12:44.004 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:12:44 compute-0 systemd[1]: Started Virtual Machine qemu-1-instance-00000001.
Jan 21 18:12:44 compute-0 nova_compute[183278]: 2026-01-21 18:12:44.077 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:12:44 compute-0 nova_compute[183278]: 2026-01-21 18:12:44.416 183284 DEBUG nova.virt.driver [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Emitting event <LifecycleEvent: 1769019164.4162943, 841e0bef-3987-412a-805b-b71e87fa2a74 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:12:44 compute-0 nova_compute[183278]: 2026-01-21 18:12:44.417 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] VM Started (Lifecycle Event)
Jan 21 18:12:44 compute-0 nova_compute[183278]: 2026-01-21 18:12:44.447 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:12:44 compute-0 nova_compute[183278]: 2026-01-21 18:12:44.450 183284 DEBUG nova.virt.driver [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Emitting event <LifecycleEvent: 1769019164.4163938, 841e0bef-3987-412a-805b-b71e87fa2a74 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:12:44 compute-0 nova_compute[183278]: 2026-01-21 18:12:44.451 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] VM Paused (Lifecycle Event)
Jan 21 18:12:44 compute-0 nova_compute[183278]: 2026-01-21 18:12:44.469 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:12:44 compute-0 nova_compute[183278]: 2026-01-21 18:12:44.472 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 18:12:44 compute-0 nova_compute[183278]: 2026-01-21 18:12:44.494 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 18:12:44 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:12:44.589 104698 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Jan 21 18:12:44 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:12:44.590 104698 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpw16nnrdp/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Jan 21 18:12:44 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:12:44.473 203892 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 21 18:12:44 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:12:44.477 203892 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 21 18:12:44 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:12:44.479 203892 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none
Jan 21 18:12:44 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:12:44.479 203892 INFO oslo.privsep.daemon [-] privsep daemon running as pid 203892
Jan 21 18:12:44 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:12:44.592 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[233b97da-d9bc-42d1-9169-adcb541234f2]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:12:44 compute-0 nova_compute[183278]: 2026-01-21 18:12:44.926 183284 DEBUG nova.compute.manager [req-e289fe3a-aad6-415c-9029-4cdb77a196cd req-8a2eb402-681f-46ef-96a3-327b1280a2fe 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] Received event network-vif-plugged-fa0544b4-ca76-47d6-a911-a353d9e6095f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:12:44 compute-0 nova_compute[183278]: 2026-01-21 18:12:44.927 183284 DEBUG oslo_concurrency.lockutils [req-e289fe3a-aad6-415c-9029-4cdb77a196cd req-8a2eb402-681f-46ef-96a3-327b1280a2fe 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "841e0bef-3987-412a-805b-b71e87fa2a74-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:12:44 compute-0 nova_compute[183278]: 2026-01-21 18:12:44.928 183284 DEBUG oslo_concurrency.lockutils [req-e289fe3a-aad6-415c-9029-4cdb77a196cd req-8a2eb402-681f-46ef-96a3-327b1280a2fe 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "841e0bef-3987-412a-805b-b71e87fa2a74-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:12:44 compute-0 nova_compute[183278]: 2026-01-21 18:12:44.928 183284 DEBUG oslo_concurrency.lockutils [req-e289fe3a-aad6-415c-9029-4cdb77a196cd req-8a2eb402-681f-46ef-96a3-327b1280a2fe 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "841e0bef-3987-412a-805b-b71e87fa2a74-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:12:44 compute-0 nova_compute[183278]: 2026-01-21 18:12:44.929 183284 DEBUG nova.compute.manager [req-e289fe3a-aad6-415c-9029-4cdb77a196cd req-8a2eb402-681f-46ef-96a3-327b1280a2fe 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] Processing event network-vif-plugged-fa0544b4-ca76-47d6-a911-a353d9e6095f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 21 18:12:44 compute-0 nova_compute[183278]: 2026-01-21 18:12:44.930 183284 DEBUG nova.compute.manager [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 18:12:44 compute-0 nova_compute[183278]: 2026-01-21 18:12:44.933 183284 DEBUG nova.virt.driver [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Emitting event <LifecycleEvent: 1769019164.9330218, 841e0bef-3987-412a-805b-b71e87fa2a74 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:12:44 compute-0 nova_compute[183278]: 2026-01-21 18:12:44.933 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] VM Resumed (Lifecycle Event)
Jan 21 18:12:44 compute-0 nova_compute[183278]: 2026-01-21 18:12:44.935 183284 DEBUG nova.virt.libvirt.driver [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 18:12:44 compute-0 nova_compute[183278]: 2026-01-21 18:12:44.938 183284 INFO nova.virt.libvirt.driver [-] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] Instance spawned successfully.
Jan 21 18:12:44 compute-0 nova_compute[183278]: 2026-01-21 18:12:44.939 183284 DEBUG nova.virt.libvirt.driver [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 21 18:12:44 compute-0 nova_compute[183278]: 2026-01-21 18:12:44.951 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:12:44 compute-0 nova_compute[183278]: 2026-01-21 18:12:44.959 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 18:12:44 compute-0 nova_compute[183278]: 2026-01-21 18:12:44.964 183284 DEBUG nova.virt.libvirt.driver [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:12:44 compute-0 nova_compute[183278]: 2026-01-21 18:12:44.964 183284 DEBUG nova.virt.libvirt.driver [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:12:44 compute-0 nova_compute[183278]: 2026-01-21 18:12:44.965 183284 DEBUG nova.virt.libvirt.driver [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:12:44 compute-0 nova_compute[183278]: 2026-01-21 18:12:44.966 183284 DEBUG nova.virt.libvirt.driver [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:12:44 compute-0 nova_compute[183278]: 2026-01-21 18:12:44.966 183284 DEBUG nova.virt.libvirt.driver [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:12:44 compute-0 nova_compute[183278]: 2026-01-21 18:12:44.967 183284 DEBUG nova.virt.libvirt.driver [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:12:44 compute-0 nova_compute[183278]: 2026-01-21 18:12:44.976 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 18:12:45 compute-0 nova_compute[183278]: 2026-01-21 18:12:45.029 183284 INFO nova.compute.manager [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] Took 10.34 seconds to spawn the instance on the hypervisor.
Jan 21 18:12:45 compute-0 nova_compute[183278]: 2026-01-21 18:12:45.030 183284 DEBUG nova.compute.manager [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:12:45 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:12:45.081 203892 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:12:45 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:12:45.081 203892 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:12:45 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:12:45.081 203892 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:12:45 compute-0 nova_compute[183278]: 2026-01-21 18:12:45.107 183284 INFO nova.compute.manager [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] Took 10.77 seconds to build instance.
Jan 21 18:12:45 compute-0 nova_compute[183278]: 2026-01-21 18:12:45.252 183284 DEBUG oslo_concurrency.lockutils [None req-d20816eb-f488-4df4-a92c-50ea09bcd42c 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Lock "841e0bef-3987-412a-805b-b71e87fa2a74" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.006s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:12:45 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:12:45.731 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[bde3141a-a2a6-4e13-b1f6-4198cf049c55]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:12:45 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:12:45.733 104698 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tape8161199-71 in ovnmeta-e8161199-7513-4099-89c4-00e7e075c92b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 21 18:12:45 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:12:45.736 203892 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tape8161199-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 21 18:12:45 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:12:45.736 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[749b2631-f84b-466d-8684-5c0a357e612a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:12:45 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:12:45.739 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[da2701b1-5ff8-4dc3-bcc3-9f88ef37aeb2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:12:45 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:12:45.763 105036 DEBUG oslo.privsep.daemon [-] privsep: reply[ef0d40d5-b4b1-44b8-97b5-a5063ad8ec1f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:12:45 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:12:45.786 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[57427987-8d41-4363-8abb-ee1e252de21c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:12:45 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:12:45.788 104698 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmpp0d4gacu/privsep.sock']
Jan 21 18:12:46 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:12:46.534 104698 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Jan 21 18:12:46 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:12:46.535 104698 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpp0d4gacu/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Jan 21 18:12:46 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:12:46.344 203906 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 21 18:12:46 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:12:46.349 203906 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 21 18:12:46 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:12:46.351 203906 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Jan 21 18:12:46 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:12:46.351 203906 INFO oslo.privsep.daemon [-] privsep daemon running as pid 203906
Jan 21 18:12:46 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:12:46.537 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[f80b5e76-49d9-4057-817d-7af7e7a8a6d2]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:12:47 compute-0 nova_compute[183278]: 2026-01-21 18:12:47.048 183284 DEBUG nova.compute.manager [req-88dc1d59-ed4e-454a-81ed-82db15a8c746 req-9774e9bc-3170-4b2a-bfd0-6608122ba857 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] Received event network-vif-plugged-fa0544b4-ca76-47d6-a911-a353d9e6095f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:12:47 compute-0 nova_compute[183278]: 2026-01-21 18:12:47.049 183284 DEBUG oslo_concurrency.lockutils [req-88dc1d59-ed4e-454a-81ed-82db15a8c746 req-9774e9bc-3170-4b2a-bfd0-6608122ba857 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "841e0bef-3987-412a-805b-b71e87fa2a74-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:12:47 compute-0 nova_compute[183278]: 2026-01-21 18:12:47.050 183284 DEBUG oslo_concurrency.lockutils [req-88dc1d59-ed4e-454a-81ed-82db15a8c746 req-9774e9bc-3170-4b2a-bfd0-6608122ba857 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "841e0bef-3987-412a-805b-b71e87fa2a74-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:12:47 compute-0 nova_compute[183278]: 2026-01-21 18:12:47.050 183284 DEBUG oslo_concurrency.lockutils [req-88dc1d59-ed4e-454a-81ed-82db15a8c746 req-9774e9bc-3170-4b2a-bfd0-6608122ba857 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "841e0bef-3987-412a-805b-b71e87fa2a74-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:12:47 compute-0 nova_compute[183278]: 2026-01-21 18:12:47.050 183284 DEBUG nova.compute.manager [req-88dc1d59-ed4e-454a-81ed-82db15a8c746 req-9774e9bc-3170-4b2a-bfd0-6608122ba857 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] No waiting events found dispatching network-vif-plugged-fa0544b4-ca76-47d6-a911-a353d9e6095f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:12:47 compute-0 nova_compute[183278]: 2026-01-21 18:12:47.050 183284 WARNING nova.compute.manager [req-88dc1d59-ed4e-454a-81ed-82db15a8c746 req-9774e9bc-3170-4b2a-bfd0-6608122ba857 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] Received unexpected event network-vif-plugged-fa0544b4-ca76-47d6-a911-a353d9e6095f for instance with vm_state active and task_state None.
Jan 21 18:12:47 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:12:47.106 203906 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:12:47 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:12:47.107 203906 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:12:47 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:12:47.107 203906 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:12:47 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:12:47.731 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[76589aea-d87b-4af3-8498-057243e3a2d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:12:47 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:12:47.749 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[102d3c41-b9f8-4ce9-a3b5-fda3f538588f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:12:47 compute-0 NetworkManager[55506]: <info>  [1769019167.7516] manager: (tape8161199-70): new Veth device (/org/freedesktop/NetworkManager/Devices/22)
Jan 21 18:12:47 compute-0 systemd-udevd[203918]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 18:12:47 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:12:47.781 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[49e34147-3e79-4608-abb7-f53f863ad09e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:12:47 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:12:47.784 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[31948031-771c-44dc-a2d9-64d47d4c96f7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:12:47 compute-0 NetworkManager[55506]: <info>  [1769019167.8101] device (tape8161199-70): carrier: link connected
Jan 21 18:12:47 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:12:47.815 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[1a0b69b0-dd90-434a-8d9d-24ed6664ae2b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:12:47 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:12:47.833 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[a0ef7d72-cf92-44f3-b1f2-b06426e8030e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape8161199-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bf:ce:84'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 373141, 'reachable_time': 20037, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 203936, 'error': None, 'target': 'ovnmeta-e8161199-7513-4099-89c4-00e7e075c92b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:12:47 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:12:47.849 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[4f4efa47-230d-4aa7-bf25-156b16a4deac]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:febf:ce84'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 373141, 'tstamp': 373141}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 203937, 'error': None, 'target': 'ovnmeta-e8161199-7513-4099-89c4-00e7e075c92b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:12:47 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:12:47.866 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[9d9ba34e-3bea-4777-b904-27b6d52d1805]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape8161199-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bf:ce:84'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 373141, 'reachable_time': 20037, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 203938, 'error': None, 'target': 'ovnmeta-e8161199-7513-4099-89c4-00e7e075c92b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:12:47 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:12:47.895 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[69f75c90-e675-4dfb-8aad-a80eb111d40e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:12:47 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:12:47.951 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[973de855-fe2a-4acb-858b-330b69f78d54]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:12:47 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:12:47.952 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape8161199-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:12:47 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:12:47.953 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 18:12:47 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:12:47.953 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape8161199-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:12:47 compute-0 NetworkManager[55506]: <info>  [1769019167.9554] manager: (tape8161199-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/23)
Jan 21 18:12:47 compute-0 kernel: tape8161199-70: entered promiscuous mode
Jan 21 18:12:47 compute-0 nova_compute[183278]: 2026-01-21 18:12:47.955 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:12:47 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:12:47.960 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape8161199-70, col_values=(('external_ids', {'iface-id': 'd5993779-4a27-48a2-a904-ec457f58cb35'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:12:47 compute-0 nova_compute[183278]: 2026-01-21 18:12:47.961 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:12:47 compute-0 ovn_controller[95419]: 2026-01-21T18:12:47Z|00031|binding|INFO|Releasing lport d5993779-4a27-48a2-a904-ec457f58cb35 from this chassis (sb_readonly=0)
Jan 21 18:12:47 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:12:47.965 104698 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e8161199-7513-4099-89c4-00e7e075c92b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e8161199-7513-4099-89c4-00e7e075c92b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 21 18:12:47 compute-0 nova_compute[183278]: 2026-01-21 18:12:47.966 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:12:47 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:12:47.970 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[364fd664-c152-49ae-a777-41af492b3dc3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:12:47 compute-0 nova_compute[183278]: 2026-01-21 18:12:47.972 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:12:47 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:12:47.972 104698 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 18:12:47 compute-0 ovn_metadata_agent[104693]: global
Jan 21 18:12:47 compute-0 ovn_metadata_agent[104693]:     log         /dev/log local0 debug
Jan 21 18:12:47 compute-0 ovn_metadata_agent[104693]:     log-tag     haproxy-metadata-proxy-e8161199-7513-4099-89c4-00e7e075c92b
Jan 21 18:12:47 compute-0 ovn_metadata_agent[104693]:     user        root
Jan 21 18:12:47 compute-0 ovn_metadata_agent[104693]:     group       root
Jan 21 18:12:47 compute-0 ovn_metadata_agent[104693]:     maxconn     1024
Jan 21 18:12:47 compute-0 ovn_metadata_agent[104693]:     pidfile     /var/lib/neutron/external/pids/e8161199-7513-4099-89c4-00e7e075c92b.pid.haproxy
Jan 21 18:12:47 compute-0 ovn_metadata_agent[104693]:     daemon
Jan 21 18:12:47 compute-0 ovn_metadata_agent[104693]: 
Jan 21 18:12:47 compute-0 ovn_metadata_agent[104693]: defaults
Jan 21 18:12:47 compute-0 ovn_metadata_agent[104693]:     log global
Jan 21 18:12:47 compute-0 ovn_metadata_agent[104693]:     mode http
Jan 21 18:12:47 compute-0 ovn_metadata_agent[104693]:     option httplog
Jan 21 18:12:47 compute-0 ovn_metadata_agent[104693]:     option dontlognull
Jan 21 18:12:47 compute-0 ovn_metadata_agent[104693]:     option http-server-close
Jan 21 18:12:47 compute-0 ovn_metadata_agent[104693]:     option forwardfor
Jan 21 18:12:47 compute-0 ovn_metadata_agent[104693]:     retries                 3
Jan 21 18:12:47 compute-0 ovn_metadata_agent[104693]:     timeout http-request    30s
Jan 21 18:12:47 compute-0 ovn_metadata_agent[104693]:     timeout connect         30s
Jan 21 18:12:47 compute-0 ovn_metadata_agent[104693]:     timeout client          32s
Jan 21 18:12:47 compute-0 ovn_metadata_agent[104693]:     timeout server          32s
Jan 21 18:12:47 compute-0 ovn_metadata_agent[104693]:     timeout http-keep-alive 30s
Jan 21 18:12:47 compute-0 ovn_metadata_agent[104693]: 
Jan 21 18:12:47 compute-0 ovn_metadata_agent[104693]: 
Jan 21 18:12:47 compute-0 ovn_metadata_agent[104693]: listen listener
Jan 21 18:12:47 compute-0 ovn_metadata_agent[104693]:     bind 169.254.169.254:80
Jan 21 18:12:47 compute-0 ovn_metadata_agent[104693]:     server metadata /var/lib/neutron/metadata_proxy
Jan 21 18:12:47 compute-0 ovn_metadata_agent[104693]:     http-request add-header X-OVN-Network-ID e8161199-7513-4099-89c4-00e7e075c92b
Jan 21 18:12:47 compute-0 ovn_metadata_agent[104693]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 21 18:12:47 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:12:47.975 104698 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-e8161199-7513-4099-89c4-00e7e075c92b', 'env', 'PROCESS_TAG=haproxy-e8161199-7513-4099-89c4-00e7e075c92b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/e8161199-7513-4099-89c4-00e7e075c92b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 21 18:12:48 compute-0 podman[203971]: 2026-01-21 18:12:48.347279975 +0000 UTC m=+0.047628602 container create eda7d6a895865894ab80827bc5a83a080d4cbe0a19a0e7c2603900db14a04004 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e8161199-7513-4099-89c4-00e7e075c92b, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS)
Jan 21 18:12:48 compute-0 systemd[1]: Started libpod-conmon-eda7d6a895865894ab80827bc5a83a080d4cbe0a19a0e7c2603900db14a04004.scope.
Jan 21 18:12:48 compute-0 systemd[1]: Started libcrun container.
Jan 21 18:12:48 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/56c7ba45ce6762c2026e89103e7e455758fd70f1949cbb35cc0d90df4e2481c2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 18:12:48 compute-0 podman[203971]: 2026-01-21 18:12:48.319207907 +0000 UTC m=+0.019556564 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 18:12:48 compute-0 podman[203971]: 2026-01-21 18:12:48.425708919 +0000 UTC m=+0.126057566 container init eda7d6a895865894ab80827bc5a83a080d4cbe0a19a0e7c2603900db14a04004 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e8161199-7513-4099-89c4-00e7e075c92b, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 18:12:48 compute-0 podman[203971]: 2026-01-21 18:12:48.430834874 +0000 UTC m=+0.131183501 container start eda7d6a895865894ab80827bc5a83a080d4cbe0a19a0e7c2603900db14a04004 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e8161199-7513-4099-89c4-00e7e075c92b, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Jan 21 18:12:48 compute-0 neutron-haproxy-ovnmeta-e8161199-7513-4099-89c4-00e7e075c92b[203988]: [NOTICE]   (203992) : New worker (203994) forked
Jan 21 18:12:48 compute-0 neutron-haproxy-ovnmeta-e8161199-7513-4099-89c4-00e7e075c92b[203988]: [NOTICE]   (203992) : Loading success.
Jan 21 18:12:49 compute-0 nova_compute[183278]: 2026-01-21 18:12:49.079 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:12:52 compute-0 nova_compute[183278]: 2026-01-21 18:12:52.967 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:12:54 compute-0 nova_compute[183278]: 2026-01-21 18:12:54.081 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:12:55 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:12:55.718 104698 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e2:aa:66', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f6:39:87:73:c8'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 18:12:55 compute-0 nova_compute[183278]: 2026-01-21 18:12:55.719 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:12:55 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:12:55.720 104698 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 21 18:12:55 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:12:55.721 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a4db021d-a451-4e5f-8011-49af760bda68, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:12:57 compute-0 nova_compute[183278]: 2026-01-21 18:12:57.968 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:12:58 compute-0 ovn_controller[95419]: 2026-01-21T18:12:58Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ac:65:b9 10.100.0.5
Jan 21 18:12:58 compute-0 ovn_controller[95419]: 2026-01-21T18:12:58Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ac:65:b9 10.100.0.5
Jan 21 18:12:59 compute-0 nova_compute[183278]: 2026-01-21 18:12:59.082 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:12:59 compute-0 podman[192560]: time="2026-01-21T18:12:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 18:12:59 compute-0 podman[192560]: @ - - [21/Jan/2026:18:12:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16587 "" "Go-http-client/1.1"
Jan 21 18:12:59 compute-0 podman[192560]: @ - - [21/Jan/2026:18:12:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2624 "" "Go-http-client/1.1"
Jan 21 18:13:00 compute-0 podman[204022]: 2026-01-21 18:13:00.998978255 +0000 UTC m=+0.054244682 container health_status 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, architecture=x86_64, distribution-scope=public, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, version=9.6, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vcs-type=git, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 21 18:13:01 compute-0 openstack_network_exporter[195402]: ERROR   18:13:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 21 18:13:01 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:13:01 compute-0 openstack_network_exporter[195402]: ERROR   18:13:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 21 18:13:01 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:13:02 compute-0 nova_compute[183278]: 2026-01-21 18:13:02.980 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:13:03 compute-0 nova_compute[183278]: 2026-01-21 18:13:03.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:13:04 compute-0 nova_compute[183278]: 2026-01-21 18:13:04.104 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:13:04 compute-0 nova_compute[183278]: 2026-01-21 18:13:04.812 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:13:04 compute-0 nova_compute[183278]: 2026-01-21 18:13:04.878 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:13:04 compute-0 nova_compute[183278]: 2026-01-21 18:13:04.879 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 18:13:04 compute-0 nova_compute[183278]: 2026-01-21 18:13:04.879 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 18:13:05 compute-0 nova_compute[183278]: 2026-01-21 18:13:05.030 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "refresh_cache-841e0bef-3987-412a-805b-b71e87fa2a74" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:13:05 compute-0 nova_compute[183278]: 2026-01-21 18:13:05.030 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquired lock "refresh_cache-841e0bef-3987-412a-805b-b71e87fa2a74" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 18:13:05 compute-0 nova_compute[183278]: 2026-01-21 18:13:05.030 183284 DEBUG nova.network.neutron [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 21 18:13:05 compute-0 nova_compute[183278]: 2026-01-21 18:13:05.031 183284 DEBUG nova.objects.instance [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lazy-loading 'info_cache' on Instance uuid 841e0bef-3987-412a-805b-b71e87fa2a74 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:13:07 compute-0 nova_compute[183278]: 2026-01-21 18:13:07.983 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:13:08 compute-0 nova_compute[183278]: 2026-01-21 18:13:08.436 183284 DEBUG nova.network.neutron [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] Updating instance_info_cache with network_info: [{"id": "fa0544b4-ca76-47d6-a911-a353d9e6095f", "address": "fa:16:3e:ac:65:b9", "network": {"id": "e8161199-7513-4099-89c4-00e7e075c92b", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-514974975-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2a4b7cdf556d4f8393d1c61b57628813", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa0544b4-ca", "ovs_interfaceid": "fa0544b4-ca76-47d6-a911-a353d9e6095f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:13:08 compute-0 nova_compute[183278]: 2026-01-21 18:13:08.458 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Releasing lock "refresh_cache-841e0bef-3987-412a-805b-b71e87fa2a74" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 18:13:08 compute-0 nova_compute[183278]: 2026-01-21 18:13:08.458 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 21 18:13:08 compute-0 nova_compute[183278]: 2026-01-21 18:13:08.458 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:13:08 compute-0 nova_compute[183278]: 2026-01-21 18:13:08.459 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:13:08 compute-0 nova_compute[183278]: 2026-01-21 18:13:08.459 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:13:08 compute-0 nova_compute[183278]: 2026-01-21 18:13:08.485 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:13:08 compute-0 nova_compute[183278]: 2026-01-21 18:13:08.486 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:13:08 compute-0 nova_compute[183278]: 2026-01-21 18:13:08.486 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:13:08 compute-0 nova_compute[183278]: 2026-01-21 18:13:08.486 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 18:13:08 compute-0 nova_compute[183278]: 2026-01-21 18:13:08.552 183284 DEBUG oslo_concurrency.processutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/841e0bef-3987-412a-805b-b71e87fa2a74/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:13:08 compute-0 podman[204046]: 2026-01-21 18:13:08.657475642 +0000 UTC m=+0.125353697 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent)
Jan 21 18:13:08 compute-0 nova_compute[183278]: 2026-01-21 18:13:08.659 183284 DEBUG oslo_concurrency.processutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/841e0bef-3987-412a-805b-b71e87fa2a74/disk --force-share --output=json" returned: 0 in 0.108s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:13:08 compute-0 nova_compute[183278]: 2026-01-21 18:13:08.660 183284 DEBUG oslo_concurrency.processutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/841e0bef-3987-412a-805b-b71e87fa2a74/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:13:08 compute-0 podman[204045]: 2026-01-21 18:13:08.677523577 +0000 UTC m=+0.147473522 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, 
container_name=ovn_controller, managed_by=edpm_ansible)
Jan 21 18:13:08 compute-0 nova_compute[183278]: 2026-01-21 18:13:08.717 183284 DEBUG oslo_concurrency.processutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/841e0bef-3987-412a-805b-b71e87fa2a74/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:13:08 compute-0 nova_compute[183278]: 2026-01-21 18:13:08.866 183284 WARNING nova.virt.libvirt.driver [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 18:13:08 compute-0 nova_compute[183278]: 2026-01-21 18:13:08.868 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5705MB free_disk=73.35476303100586GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 18:13:08 compute-0 nova_compute[183278]: 2026-01-21 18:13:08.868 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:13:08 compute-0 nova_compute[183278]: 2026-01-21 18:13:08.868 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:13:08 compute-0 nova_compute[183278]: 2026-01-21 18:13:08.954 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Instance 841e0bef-3987-412a-805b-b71e87fa2a74 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 21 18:13:08 compute-0 nova_compute[183278]: 2026-01-21 18:13:08.954 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 18:13:08 compute-0 nova_compute[183278]: 2026-01-21 18:13:08.954 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 18:13:08 compute-0 nova_compute[183278]: 2026-01-21 18:13:08.990 183284 DEBUG nova.compute.provider_tree [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Updating inventory in ProviderTree for provider 502e4243-611b-433d-a766-9b485d51652d with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 21 18:13:09 compute-0 nova_compute[183278]: 2026-01-21 18:13:09.024 183284 ERROR nova.scheduler.client.report [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] [req-bee3af78-8163-47b2-92f1-3ef7c8e42e36] Failed to update inventory to [{'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}}] for resource provider with UUID 502e4243-611b-433d-a766-9b485d51652d.  Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict  ", "code": "placement.concurrent_update", "request_id": "req-bee3af78-8163-47b2-92f1-3ef7c8e42e36"}]}
Jan 21 18:13:09 compute-0 nova_compute[183278]: 2026-01-21 18:13:09.035 183284 DEBUG nova.scheduler.client.report [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Refreshing inventories for resource provider 502e4243-611b-433d-a766-9b485d51652d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 21 18:13:09 compute-0 nova_compute[183278]: 2026-01-21 18:13:09.056 183284 DEBUG nova.scheduler.client.report [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Updating ProviderTree inventory for provider 502e4243-611b-433d-a766-9b485d51652d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 21 18:13:09 compute-0 nova_compute[183278]: 2026-01-21 18:13:09.056 183284 DEBUG nova.compute.provider_tree [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Updating inventory in ProviderTree for provider 502e4243-611b-433d-a766-9b485d51652d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 21 18:13:09 compute-0 nova_compute[183278]: 2026-01-21 18:13:09.068 183284 DEBUG nova.scheduler.client.report [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Refreshing aggregate associations for resource provider 502e4243-611b-433d-a766-9b485d51652d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 21 18:13:09 compute-0 nova_compute[183278]: 2026-01-21 18:13:09.099 183284 DEBUG nova.scheduler.client.report [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Refreshing trait associations for resource provider 502e4243-611b-433d-a766-9b485d51652d, traits: COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_RESCUE_BFV,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_MMX,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NODE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_IDE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 21 18:13:09 compute-0 nova_compute[183278]: 2026-01-21 18:13:09.107 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:13:09 compute-0 nova_compute[183278]: 2026-01-21 18:13:09.134 183284 DEBUG nova.compute.provider_tree [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Updating inventory in ProviderTree for provider 502e4243-611b-433d-a766-9b485d51652d with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 21 18:13:09 compute-0 nova_compute[183278]: 2026-01-21 18:13:09.172 183284 DEBUG nova.scheduler.client.report [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Updated inventory for provider 502e4243-611b-433d-a766-9b485d51652d with generation 3 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Jan 21 18:13:09 compute-0 nova_compute[183278]: 2026-01-21 18:13:09.172 183284 DEBUG nova.compute.provider_tree [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Updating resource provider 502e4243-611b-433d-a766-9b485d51652d generation from 3 to 4 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Jan 21 18:13:09 compute-0 nova_compute[183278]: 2026-01-21 18:13:09.173 183284 DEBUG nova.compute.provider_tree [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Updating inventory in ProviderTree for provider 502e4243-611b-433d-a766-9b485d51652d with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 21 18:13:09 compute-0 nova_compute[183278]: 2026-01-21 18:13:09.196 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 18:13:09 compute-0 nova_compute[183278]: 2026-01-21 18:13:09.197 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.328s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:13:09 compute-0 nova_compute[183278]: 2026-01-21 18:13:09.554 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:13:09 compute-0 nova_compute[183278]: 2026-01-21 18:13:09.554 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:13:09 compute-0 nova_compute[183278]: 2026-01-21 18:13:09.555 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:13:09 compute-0 nova_compute[183278]: 2026-01-21 18:13:09.555 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 18:13:09 compute-0 nova_compute[183278]: 2026-01-21 18:13:09.812 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:13:12 compute-0 podman[204098]: 2026-01-21 18:13:12.007278621 +0000 UTC m=+0.051157096 container health_status 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 21 18:13:12 compute-0 nova_compute[183278]: 2026-01-21 18:13:12.986 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:13:13 compute-0 nova_compute[183278]: 2026-01-21 18:13:13.215 183284 DEBUG oslo_concurrency.lockutils [None req-e0b28571-e376-4522-9a59-34e3784ec43d 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "refresh_cache-841e0bef-3987-412a-805b-b71e87fa2a74" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:13:13 compute-0 nova_compute[183278]: 2026-01-21 18:13:13.215 183284 DEBUG oslo_concurrency.lockutils [None req-e0b28571-e376-4522-9a59-34e3784ec43d 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Acquired lock "refresh_cache-841e0bef-3987-412a-805b-b71e87fa2a74" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 18:13:13 compute-0 nova_compute[183278]: 2026-01-21 18:13:13.215 183284 DEBUG nova.network.neutron [None req-e0b28571-e376-4522-9a59-34e3784ec43d 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 18:13:14 compute-0 nova_compute[183278]: 2026-01-21 18:13:14.109 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:13:14 compute-0 nova_compute[183278]: 2026-01-21 18:13:14.872 183284 DEBUG nova.network.neutron [None req-e0b28571-e376-4522-9a59-34e3784ec43d 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] Updating instance_info_cache with network_info: [{"id": "fa0544b4-ca76-47d6-a911-a353d9e6095f", "address": "fa:16:3e:ac:65:b9", "network": {"id": "e8161199-7513-4099-89c4-00e7e075c92b", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-514974975-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2a4b7cdf556d4f8393d1c61b57628813", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa0544b4-ca", "ovs_interfaceid": "fa0544b4-ca76-47d6-a911-a353d9e6095f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:13:14 compute-0 nova_compute[183278]: 2026-01-21 18:13:14.901 183284 DEBUG oslo_concurrency.lockutils [None req-e0b28571-e376-4522-9a59-34e3784ec43d 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Releasing lock "refresh_cache-841e0bef-3987-412a-805b-b71e87fa2a74" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 18:13:14 compute-0 nova_compute[183278]: 2026-01-21 18:13:14.995 183284 DEBUG nova.virt.libvirt.driver [None req-e0b28571-e376-4522-9a59-34e3784ec43d 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511
Jan 21 18:13:14 compute-0 nova_compute[183278]: 2026-01-21 18:13:14.996 183284 DEBUG nova.virt.libvirt.volume.remotefs [None req-e0b28571-e376-4522-9a59-34e3784ec43d 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Creating file /var/lib/nova/instances/841e0bef-3987-412a-805b-b71e87fa2a74/e1d56ab1b7524f54a2b021fbc30c402c.tmp on remote host 192.168.122.101 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79
Jan 21 18:13:14 compute-0 nova_compute[183278]: 2026-01-21 18:13:14.996 183284 DEBUG oslo_concurrency.processutils [None req-e0b28571-e376-4522-9a59-34e3784ec43d 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/841e0bef-3987-412a-805b-b71e87fa2a74/e1d56ab1b7524f54a2b021fbc30c402c.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:13:15 compute-0 nova_compute[183278]: 2026-01-21 18:13:15.399 183284 DEBUG oslo_concurrency.processutils [None req-e0b28571-e376-4522-9a59-34e3784ec43d 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/841e0bef-3987-412a-805b-b71e87fa2a74/e1d56ab1b7524f54a2b021fbc30c402c.tmp" returned: 1 in 0.403s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:13:15 compute-0 nova_compute[183278]: 2026-01-21 18:13:15.400 183284 DEBUG oslo_concurrency.processutils [None req-e0b28571-e376-4522-9a59-34e3784ec43d 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] 'ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/841e0bef-3987-412a-805b-b71e87fa2a74/e1d56ab1b7524f54a2b021fbc30c402c.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Jan 21 18:13:15 compute-0 nova_compute[183278]: 2026-01-21 18:13:15.401 183284 DEBUG nova.virt.libvirt.volume.remotefs [None req-e0b28571-e376-4522-9a59-34e3784ec43d 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Creating directory /var/lib/nova/instances/841e0bef-3987-412a-805b-b71e87fa2a74 on remote host 192.168.122.101 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91
Jan 21 18:13:15 compute-0 nova_compute[183278]: 2026-01-21 18:13:15.401 183284 DEBUG oslo_concurrency.processutils [None req-e0b28571-e376-4522-9a59-34e3784ec43d 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/841e0bef-3987-412a-805b-b71e87fa2a74 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:13:15 compute-0 nova_compute[183278]: 2026-01-21 18:13:15.599 183284 DEBUG oslo_concurrency.processutils [None req-e0b28571-e376-4522-9a59-34e3784ec43d 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/841e0bef-3987-412a-805b-b71e87fa2a74" returned: 0 in 0.198s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:13:15 compute-0 nova_compute[183278]: 2026-01-21 18:13:15.604 183284 DEBUG nova.virt.libvirt.driver [None req-e0b28571-e376-4522-9a59-34e3784ec43d 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 21 18:13:17 compute-0 nova_compute[183278]: 2026-01-21 18:13:17.988 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:13:18 compute-0 kernel: tapfa0544b4-ca (unregistering): left promiscuous mode
Jan 21 18:13:18 compute-0 NetworkManager[55506]: <info>  [1769019198.1494] device (tapfa0544b4-ca): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 18:13:18 compute-0 nova_compute[183278]: 2026-01-21 18:13:18.156 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:13:18 compute-0 ovn_controller[95419]: 2026-01-21T18:13:18Z|00032|binding|INFO|Releasing lport fa0544b4-ca76-47d6-a911-a353d9e6095f from this chassis (sb_readonly=0)
Jan 21 18:13:18 compute-0 ovn_controller[95419]: 2026-01-21T18:13:18Z|00033|binding|INFO|Setting lport fa0544b4-ca76-47d6-a911-a353d9e6095f down in Southbound
Jan 21 18:13:18 compute-0 ovn_controller[95419]: 2026-01-21T18:13:18Z|00034|binding|INFO|Removing iface tapfa0544b4-ca ovn-installed in OVS
Jan 21 18:13:18 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:13:18.166 104698 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ac:65:b9 10.100.0.5'], port_security=['fa:16:3e:ac:65:b9 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '841e0bef-3987-412a-805b-b71e87fa2a74', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e8161199-7513-4099-89c4-00e7e075c92b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a4b7cdf556d4f8393d1c61b57628813', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c5ea7560-106a-40fd-a00a-355d8be6545e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=afee9644-a390-49fb-b346-3fd1c948feef, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>], logical_port=fa0544b4-ca76-47d6-a911-a353d9e6095f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 18:13:18 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:13:18.167 104698 INFO neutron.agent.ovn.metadata.agent [-] Port fa0544b4-ca76-47d6-a911-a353d9e6095f in datapath e8161199-7513-4099-89c4-00e7e075c92b unbound from our chassis
Jan 21 18:13:18 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:13:18.168 104698 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e8161199-7513-4099-89c4-00e7e075c92b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 21 18:13:18 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:13:18.169 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[851834e6-1ab2-4ff6-bb88-ab79b8012659]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:13:18 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:13:18.170 104698 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-e8161199-7513-4099-89c4-00e7e075c92b namespace which is not needed anymore
Jan 21 18:13:18 compute-0 nova_compute[183278]: 2026-01-21 18:13:18.172 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:13:18 compute-0 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000001.scope: Deactivated successfully.
Jan 21 18:13:18 compute-0 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000001.scope: Consumed 13.342s CPU time.
Jan 21 18:13:18 compute-0 systemd-machined[154592]: Machine qemu-1-instance-00000001 terminated.
Jan 21 18:13:18 compute-0 nova_compute[183278]: 2026-01-21 18:13:18.394 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:13:18 compute-0 nova_compute[183278]: 2026-01-21 18:13:18.399 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:13:18 compute-0 nova_compute[183278]: 2026-01-21 18:13:18.608 183284 DEBUG nova.compute.manager [req-7a0e6b22-6345-4805-b5f4-9a9a371a3e3f req-e88fda89-c466-4d48-ab0d-630c897601d7 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] Received event network-vif-unplugged-fa0544b4-ca76-47d6-a911-a353d9e6095f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:13:18 compute-0 nova_compute[183278]: 2026-01-21 18:13:18.608 183284 DEBUG oslo_concurrency.lockutils [req-7a0e6b22-6345-4805-b5f4-9a9a371a3e3f req-e88fda89-c466-4d48-ab0d-630c897601d7 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "841e0bef-3987-412a-805b-b71e87fa2a74-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:13:18 compute-0 nova_compute[183278]: 2026-01-21 18:13:18.608 183284 DEBUG oslo_concurrency.lockutils [req-7a0e6b22-6345-4805-b5f4-9a9a371a3e3f req-e88fda89-c466-4d48-ab0d-630c897601d7 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "841e0bef-3987-412a-805b-b71e87fa2a74-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:13:18 compute-0 nova_compute[183278]: 2026-01-21 18:13:18.609 183284 DEBUG oslo_concurrency.lockutils [req-7a0e6b22-6345-4805-b5f4-9a9a371a3e3f req-e88fda89-c466-4d48-ab0d-630c897601d7 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "841e0bef-3987-412a-805b-b71e87fa2a74-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:13:18 compute-0 nova_compute[183278]: 2026-01-21 18:13:18.609 183284 DEBUG nova.compute.manager [req-7a0e6b22-6345-4805-b5f4-9a9a371a3e3f req-e88fda89-c466-4d48-ab0d-630c897601d7 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] No waiting events found dispatching network-vif-unplugged-fa0544b4-ca76-47d6-a911-a353d9e6095f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:13:18 compute-0 nova_compute[183278]: 2026-01-21 18:13:18.609 183284 WARNING nova.compute.manager [req-7a0e6b22-6345-4805-b5f4-9a9a371a3e3f req-e88fda89-c466-4d48-ab0d-630c897601d7 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] Received unexpected event network-vif-unplugged-fa0544b4-ca76-47d6-a911-a353d9e6095f for instance with vm_state active and task_state resize_migrating.
Jan 21 18:13:18 compute-0 nova_compute[183278]: 2026-01-21 18:13:18.621 183284 INFO nova.virt.libvirt.driver [None req-e0b28571-e376-4522-9a59-34e3784ec43d 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] Instance shutdown successfully after 3 seconds.
Jan 21 18:13:18 compute-0 nova_compute[183278]: 2026-01-21 18:13:18.626 183284 INFO nova.virt.libvirt.driver [-] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] Instance destroyed successfully.
Jan 21 18:13:18 compute-0 nova_compute[183278]: 2026-01-21 18:13:18.626 183284 DEBUG nova.virt.libvirt.vif [None req-e0b28571-e376-4522-9a59-34e3784ec43d 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T18:12:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-215034302',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-215034302',id=1,image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T18:12:45Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='2a4b7cdf556d4f8393d1c61b57628813',ramdisk_id='',reservation_id='r-gzrkbma3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestExecuteActionsViaActuator-627352265',owner_user_name='tempest-TestExecuteActionsViaActuator-627352265-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T18:13:12Z,user_data=None,user_id='16f8ab2ae83b48f9a88753a5deddcc19',uuid=841e0bef-3987-412a-805b-b71e87fa2a74,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fa0544b4-ca76-47d6-a911-a353d9e6095f", "address": "fa:16:3e:ac:65:b9", "network": {"id": "e8161199-7513-4099-89c4-00e7e075c92b", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-514974975-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-514974975-network", "vif_mac": "fa:16:3e:ac:65:b9"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2a4b7cdf556d4f8393d1c61b57628813", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa0544b4-ca", "ovs_interfaceid": "fa0544b4-ca76-47d6-a911-a353d9e6095f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 18:13:18 compute-0 nova_compute[183278]: 2026-01-21 18:13:18.627 183284 DEBUG nova.network.os_vif_util [None req-e0b28571-e376-4522-9a59-34e3784ec43d 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Converting VIF {"id": "fa0544b4-ca76-47d6-a911-a353d9e6095f", "address": "fa:16:3e:ac:65:b9", "network": {"id": "e8161199-7513-4099-89c4-00e7e075c92b", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-514974975-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-514974975-network", "vif_mac": "fa:16:3e:ac:65:b9"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2a4b7cdf556d4f8393d1c61b57628813", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa0544b4-ca", "ovs_interfaceid": "fa0544b4-ca76-47d6-a911-a353d9e6095f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 18:13:18 compute-0 nova_compute[183278]: 2026-01-21 18:13:18.627 183284 DEBUG nova.network.os_vif_util [None req-e0b28571-e376-4522-9a59-34e3784ec43d 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ac:65:b9,bridge_name='br-int',has_traffic_filtering=True,id=fa0544b4-ca76-47d6-a911-a353d9e6095f,network=Network(e8161199-7513-4099-89c4-00e7e075c92b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfa0544b4-ca') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 18:13:18 compute-0 nova_compute[183278]: 2026-01-21 18:13:18.628 183284 DEBUG os_vif [None req-e0b28571-e376-4522-9a59-34e3784ec43d 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ac:65:b9,bridge_name='br-int',has_traffic_filtering=True,id=fa0544b4-ca76-47d6-a911-a353d9e6095f,network=Network(e8161199-7513-4099-89c4-00e7e075c92b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfa0544b4-ca') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 21 18:13:18 compute-0 nova_compute[183278]: 2026-01-21 18:13:18.629 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:13:18 compute-0 nova_compute[183278]: 2026-01-21 18:13:18.630 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfa0544b4-ca, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:13:18 compute-0 nova_compute[183278]: 2026-01-21 18:13:18.632 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:13:18 compute-0 nova_compute[183278]: 2026-01-21 18:13:18.633 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:13:18 compute-0 nova_compute[183278]: 2026-01-21 18:13:18.635 183284 INFO os_vif [None req-e0b28571-e376-4522-9a59-34e3784ec43d 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ac:65:b9,bridge_name='br-int',has_traffic_filtering=True,id=fa0544b4-ca76-47d6-a911-a353d9e6095f,network=Network(e8161199-7513-4099-89c4-00e7e075c92b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfa0544b4-ca')
Jan 21 18:13:18 compute-0 nova_compute[183278]: 2026-01-21 18:13:18.639 183284 DEBUG oslo_concurrency.processutils [None req-e0b28571-e376-4522-9a59-34e3784ec43d 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/841e0bef-3987-412a-805b-b71e87fa2a74/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:13:18 compute-0 neutron-haproxy-ovnmeta-e8161199-7513-4099-89c4-00e7e075c92b[203988]: [NOTICE]   (203992) : haproxy version is 2.8.14-c23fe91
Jan 21 18:13:18 compute-0 neutron-haproxy-ovnmeta-e8161199-7513-4099-89c4-00e7e075c92b[203988]: [NOTICE]   (203992) : path to executable is /usr/sbin/haproxy
Jan 21 18:13:18 compute-0 neutron-haproxy-ovnmeta-e8161199-7513-4099-89c4-00e7e075c92b[203988]: [WARNING]  (203992) : Exiting Master process...
Jan 21 18:13:18 compute-0 neutron-haproxy-ovnmeta-e8161199-7513-4099-89c4-00e7e075c92b[203988]: [WARNING]  (203992) : Exiting Master process...
Jan 21 18:13:18 compute-0 neutron-haproxy-ovnmeta-e8161199-7513-4099-89c4-00e7e075c92b[203988]: [ALERT]    (203992) : Current worker (203994) exited with code 143 (Terminated)
Jan 21 18:13:18 compute-0 neutron-haproxy-ovnmeta-e8161199-7513-4099-89c4-00e7e075c92b[203988]: [WARNING]  (203992) : All workers exited. Exiting... (0)
Jan 21 18:13:18 compute-0 systemd[1]: libpod-eda7d6a895865894ab80827bc5a83a080d4cbe0a19a0e7c2603900db14a04004.scope: Deactivated successfully.
Jan 21 18:13:18 compute-0 podman[204151]: 2026-01-21 18:13:18.666412877 +0000 UTC m=+0.417234621 container died eda7d6a895865894ab80827bc5a83a080d4cbe0a19a0e7c2603900db14a04004 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e8161199-7513-4099-89c4-00e7e075c92b, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Jan 21 18:13:18 compute-0 nova_compute[183278]: 2026-01-21 18:13:18.702 183284 DEBUG oslo_concurrency.processutils [None req-e0b28571-e376-4522-9a59-34e3784ec43d 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/841e0bef-3987-412a-805b-b71e87fa2a74/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:13:18 compute-0 nova_compute[183278]: 2026-01-21 18:13:18.703 183284 DEBUG oslo_concurrency.processutils [None req-e0b28571-e376-4522-9a59-34e3784ec43d 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/841e0bef-3987-412a-805b-b71e87fa2a74/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:13:18 compute-0 nova_compute[183278]: 2026-01-21 18:13:18.755 183284 DEBUG oslo_concurrency.processutils [None req-e0b28571-e376-4522-9a59-34e3784ec43d 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/841e0bef-3987-412a-805b-b71e87fa2a74/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:13:18 compute-0 nova_compute[183278]: 2026-01-21 18:13:18.758 183284 DEBUG nova.virt.libvirt.volume.remotefs [None req-e0b28571-e376-4522-9a59-34e3784ec43d 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Copying file /var/lib/nova/instances/841e0bef-3987-412a-805b-b71e87fa2a74_resize/disk to 192.168.122.101:/var/lib/nova/instances/841e0bef-3987-412a-805b-b71e87fa2a74/disk copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 21 18:13:18 compute-0 nova_compute[183278]: 2026-01-21 18:13:18.758 183284 DEBUG oslo_concurrency.processutils [None req-e0b28571-e376-4522-9a59-34e3784ec43d 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/841e0bef-3987-412a-805b-b71e87fa2a74_resize/disk 192.168.122.101:/var/lib/nova/instances/841e0bef-3987-412a-805b-b71e87fa2a74/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:13:18 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-eda7d6a895865894ab80827bc5a83a080d4cbe0a19a0e7c2603900db14a04004-userdata-shm.mount: Deactivated successfully.
Jan 21 18:13:18 compute-0 systemd[1]: var-lib-containers-storage-overlay-56c7ba45ce6762c2026e89103e7e455758fd70f1949cbb35cc0d90df4e2481c2-merged.mount: Deactivated successfully.
Jan 21 18:13:18 compute-0 podman[204151]: 2026-01-21 18:13:18.841407124 +0000 UTC m=+0.592228868 container cleanup eda7d6a895865894ab80827bc5a83a080d4cbe0a19a0e7c2603900db14a04004 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e8161199-7513-4099-89c4-00e7e075c92b, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 21 18:13:18 compute-0 systemd[1]: libpod-conmon-eda7d6a895865894ab80827bc5a83a080d4cbe0a19a0e7c2603900db14a04004.scope: Deactivated successfully.
Jan 21 18:13:19 compute-0 nova_compute[183278]: 2026-01-21 18:13:19.111 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:13:19 compute-0 nova_compute[183278]: 2026-01-21 18:13:19.243 183284 DEBUG oslo_concurrency.processutils [None req-e0b28571-e376-4522-9a59-34e3784ec43d 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] CMD "scp -r /var/lib/nova/instances/841e0bef-3987-412a-805b-b71e87fa2a74_resize/disk 192.168.122.101:/var/lib/nova/instances/841e0bef-3987-412a-805b-b71e87fa2a74/disk" returned: 0 in 0.485s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:13:19 compute-0 nova_compute[183278]: 2026-01-21 18:13:19.244 183284 DEBUG nova.virt.libvirt.volume.remotefs [None req-e0b28571-e376-4522-9a59-34e3784ec43d 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Copying file /var/lib/nova/instances/841e0bef-3987-412a-805b-b71e87fa2a74_resize/disk.config to 192.168.122.101:/var/lib/nova/instances/841e0bef-3987-412a-805b-b71e87fa2a74/disk.config copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 21 18:13:19 compute-0 nova_compute[183278]: 2026-01-21 18:13:19.244 183284 DEBUG oslo_concurrency.processutils [None req-e0b28571-e376-4522-9a59-34e3784ec43d 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/841e0bef-3987-412a-805b-b71e87fa2a74_resize/disk.config 192.168.122.101:/var/lib/nova/instances/841e0bef-3987-412a-805b-b71e87fa2a74/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:13:19 compute-0 podman[204203]: 2026-01-21 18:13:19.455730446 +0000 UTC m=+0.592700780 container remove eda7d6a895865894ab80827bc5a83a080d4cbe0a19a0e7c2603900db14a04004 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e8161199-7513-4099-89c4-00e7e075c92b, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 21 18:13:19 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:13:19.462 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[de8025f2-cb75-4091-ba41-53bd2d3a25a9]: (4, ('Wed Jan 21 06:13:18 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-e8161199-7513-4099-89c4-00e7e075c92b (eda7d6a895865894ab80827bc5a83a080d4cbe0a19a0e7c2603900db14a04004)\neda7d6a895865894ab80827bc5a83a080d4cbe0a19a0e7c2603900db14a04004\nWed Jan 21 06:13:18 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-e8161199-7513-4099-89c4-00e7e075c92b (eda7d6a895865894ab80827bc5a83a080d4cbe0a19a0e7c2603900db14a04004)\neda7d6a895865894ab80827bc5a83a080d4cbe0a19a0e7c2603900db14a04004\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:13:19 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:13:19.464 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[565f4c94-79a5-45be-b6d1-8c379a012d9a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:13:19 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:13:19.465 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape8161199-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:13:19 compute-0 kernel: tape8161199-70: left promiscuous mode
Jan 21 18:13:19 compute-0 nova_compute[183278]: 2026-01-21 18:13:19.467 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:13:19 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:13:19.473 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[2c0a1d3f-6516-4cc0-8f52-7c1c6b5fad9c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:13:19 compute-0 nova_compute[183278]: 2026-01-21 18:13:19.481 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:13:19 compute-0 nova_compute[183278]: 2026-01-21 18:13:19.485 183284 DEBUG oslo_concurrency.processutils [None req-e0b28571-e376-4522-9a59-34e3784ec43d 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] CMD "scp -C -r /var/lib/nova/instances/841e0bef-3987-412a-805b-b71e87fa2a74_resize/disk.config 192.168.122.101:/var/lib/nova/instances/841e0bef-3987-412a-805b-b71e87fa2a74/disk.config" returned: 0 in 0.240s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:13:19 compute-0 nova_compute[183278]: 2026-01-21 18:13:19.485 183284 DEBUG nova.virt.libvirt.volume.remotefs [None req-e0b28571-e376-4522-9a59-34e3784ec43d 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Copying file /var/lib/nova/instances/841e0bef-3987-412a-805b-b71e87fa2a74_resize/disk.info to 192.168.122.101:/var/lib/nova/instances/841e0bef-3987-412a-805b-b71e87fa2a74/disk.info copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 21 18:13:19 compute-0 nova_compute[183278]: 2026-01-21 18:13:19.486 183284 DEBUG oslo_concurrency.processutils [None req-e0b28571-e376-4522-9a59-34e3784ec43d 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/841e0bef-3987-412a-805b-b71e87fa2a74_resize/disk.info 192.168.122.101:/var/lib/nova/instances/841e0bef-3987-412a-805b-b71e87fa2a74/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:13:19 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:13:19.494 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[af1472ae-c2ed-401c-ae34-79c3bbcc106f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:13:19 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:13:19.495 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[d009857b-bfd7-488e-8685-84d96a627cf1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:13:19 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:13:19.508 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[3a13d808-5349-4c02-9306-d19b881fa699]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 373133, 'reachable_time': 21930, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 204222, 'error': None, 'target': 'ovnmeta-e8161199-7513-4099-89c4-00e7e075c92b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:13:19 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:13:19.516 105036 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-e8161199-7513-4099-89c4-00e7e075c92b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 21 18:13:19 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:13:19.517 105036 DEBUG oslo.privsep.daemon [-] privsep: reply[dd1864d5-b64b-4a97-96d1-4717d942c9d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:13:19 compute-0 systemd[1]: run-netns-ovnmeta\x2de8161199\x2d7513\x2d4099\x2d89c4\x2d00e7e075c92b.mount: Deactivated successfully.
Jan 21 18:13:19 compute-0 nova_compute[183278]: 2026-01-21 18:13:19.697 183284 DEBUG oslo_concurrency.processutils [None req-e0b28571-e376-4522-9a59-34e3784ec43d 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] CMD "scp -C -r /var/lib/nova/instances/841e0bef-3987-412a-805b-b71e87fa2a74_resize/disk.info 192.168.122.101:/var/lib/nova/instances/841e0bef-3987-412a-805b-b71e87fa2a74/disk.info" returned: 0 in 0.211s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:13:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:13:20.069 104698 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:13:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:13:20.069 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:13:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:13:20.069 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:13:20 compute-0 nova_compute[183278]: 2026-01-21 18:13:20.555 183284 DEBUG neutronclient.v2_0.client [None req-e0b28571-e376-4522-9a59-34e3784ec43d 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port fa0544b4-ca76-47d6-a911-a353d9e6095f for host compute-1.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262
Jan 21 18:13:20 compute-0 nova_compute[183278]: 2026-01-21 18:13:20.716 183284 DEBUG oslo_concurrency.lockutils [None req-e0b28571-e376-4522-9a59-34e3784ec43d 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:13:20 compute-0 nova_compute[183278]: 2026-01-21 18:13:20.716 183284 DEBUG oslo_concurrency.lockutils [None req-e0b28571-e376-4522-9a59-34e3784ec43d 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Acquired lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 18:13:20 compute-0 nova_compute[183278]: 2026-01-21 18:13:20.719 183284 DEBUG nova.compute.manager [req-f544bb95-8ea1-4e24-917a-3af6d98da473 req-e3f0762a-37bb-492a-aea0-97aa17867139 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] Received event network-vif-plugged-fa0544b4-ca76-47d6-a911-a353d9e6095f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:13:20 compute-0 nova_compute[183278]: 2026-01-21 18:13:20.720 183284 DEBUG oslo_concurrency.lockutils [req-f544bb95-8ea1-4e24-917a-3af6d98da473 req-e3f0762a-37bb-492a-aea0-97aa17867139 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "841e0bef-3987-412a-805b-b71e87fa2a74-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:13:20 compute-0 nova_compute[183278]: 2026-01-21 18:13:20.720 183284 DEBUG oslo_concurrency.lockutils [req-f544bb95-8ea1-4e24-917a-3af6d98da473 req-e3f0762a-37bb-492a-aea0-97aa17867139 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "841e0bef-3987-412a-805b-b71e87fa2a74-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:13:20 compute-0 nova_compute[183278]: 2026-01-21 18:13:20.720 183284 DEBUG oslo_concurrency.lockutils [req-f544bb95-8ea1-4e24-917a-3af6d98da473 req-e3f0762a-37bb-492a-aea0-97aa17867139 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "841e0bef-3987-412a-805b-b71e87fa2a74-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:13:20 compute-0 nova_compute[183278]: 2026-01-21 18:13:20.720 183284 DEBUG nova.compute.manager [req-f544bb95-8ea1-4e24-917a-3af6d98da473 req-e3f0762a-37bb-492a-aea0-97aa17867139 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] No waiting events found dispatching network-vif-plugged-fa0544b4-ca76-47d6-a911-a353d9e6095f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:13:20 compute-0 nova_compute[183278]: 2026-01-21 18:13:20.720 183284 WARNING nova.compute.manager [req-f544bb95-8ea1-4e24-917a-3af6d98da473 req-e3f0762a-37bb-492a-aea0-97aa17867139 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] Received unexpected event network-vif-plugged-fa0544b4-ca76-47d6-a911-a353d9e6095f for instance with vm_state active and task_state resize_migrating.
Jan 21 18:13:20 compute-0 nova_compute[183278]: 2026-01-21 18:13:20.723 183284 INFO nova.compute.rpcapi [None req-e0b28571-e376-4522-9a59-34e3784ec43d 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Automatically selected compute RPC version 6.2 from minimum service version 66
Jan 21 18:13:20 compute-0 nova_compute[183278]: 2026-01-21 18:13:20.723 183284 DEBUG oslo_concurrency.lockutils [None req-e0b28571-e376-4522-9a59-34e3784ec43d 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Releasing lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 18:13:20 compute-0 nova_compute[183278]: 2026-01-21 18:13:20.736 183284 DEBUG oslo_concurrency.lockutils [None req-e0b28571-e376-4522-9a59-34e3784ec43d 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "841e0bef-3987-412a-805b-b71e87fa2a74-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:13:20 compute-0 nova_compute[183278]: 2026-01-21 18:13:20.737 183284 DEBUG oslo_concurrency.lockutils [None req-e0b28571-e376-4522-9a59-34e3784ec43d 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lock "841e0bef-3987-412a-805b-b71e87fa2a74-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:13:20 compute-0 nova_compute[183278]: 2026-01-21 18:13:20.737 183284 DEBUG oslo_concurrency.lockutils [None req-e0b28571-e376-4522-9a59-34e3784ec43d 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lock "841e0bef-3987-412a-805b-b71e87fa2a74-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:13:22 compute-0 nova_compute[183278]: 2026-01-21 18:13:22.768 183284 DEBUG nova.compute.manager [req-96939c01-59ad-46ca-a177-ea3ff527fe27 req-d8879ad3-835b-498f-9d2e-ea592166f8d5 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] Received event network-changed-fa0544b4-ca76-47d6-a911-a353d9e6095f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:13:22 compute-0 nova_compute[183278]: 2026-01-21 18:13:22.769 183284 DEBUG nova.compute.manager [req-96939c01-59ad-46ca-a177-ea3ff527fe27 req-d8879ad3-835b-498f-9d2e-ea592166f8d5 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] Refreshing instance network info cache due to event network-changed-fa0544b4-ca76-47d6-a911-a353d9e6095f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 18:13:22 compute-0 nova_compute[183278]: 2026-01-21 18:13:22.769 183284 DEBUG oslo_concurrency.lockutils [req-96939c01-59ad-46ca-a177-ea3ff527fe27 req-d8879ad3-835b-498f-9d2e-ea592166f8d5 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "refresh_cache-841e0bef-3987-412a-805b-b71e87fa2a74" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:13:22 compute-0 nova_compute[183278]: 2026-01-21 18:13:22.769 183284 DEBUG oslo_concurrency.lockutils [req-96939c01-59ad-46ca-a177-ea3ff527fe27 req-d8879ad3-835b-498f-9d2e-ea592166f8d5 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquired lock "refresh_cache-841e0bef-3987-412a-805b-b71e87fa2a74" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 18:13:22 compute-0 nova_compute[183278]: 2026-01-21 18:13:22.769 183284 DEBUG nova.network.neutron [req-96939c01-59ad-46ca-a177-ea3ff527fe27 req-d8879ad3-835b-498f-9d2e-ea592166f8d5 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] Refreshing network info cache for port fa0544b4-ca76-47d6-a911-a353d9e6095f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 18:13:23 compute-0 nova_compute[183278]: 2026-01-21 18:13:23.675 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:13:24 compute-0 nova_compute[183278]: 2026-01-21 18:13:24.113 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:13:24 compute-0 nova_compute[183278]: 2026-01-21 18:13:24.730 183284 DEBUG nova.network.neutron [req-96939c01-59ad-46ca-a177-ea3ff527fe27 req-d8879ad3-835b-498f-9d2e-ea592166f8d5 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] Updated VIF entry in instance network info cache for port fa0544b4-ca76-47d6-a911-a353d9e6095f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 18:13:24 compute-0 nova_compute[183278]: 2026-01-21 18:13:24.730 183284 DEBUG nova.network.neutron [req-96939c01-59ad-46ca-a177-ea3ff527fe27 req-d8879ad3-835b-498f-9d2e-ea592166f8d5 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] Updating instance_info_cache with network_info: [{"id": "fa0544b4-ca76-47d6-a911-a353d9e6095f", "address": "fa:16:3e:ac:65:b9", "network": {"id": "e8161199-7513-4099-89c4-00e7e075c92b", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-514974975-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2a4b7cdf556d4f8393d1c61b57628813", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa0544b4-ca", "ovs_interfaceid": "fa0544b4-ca76-47d6-a911-a353d9e6095f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:13:24 compute-0 nova_compute[183278]: 2026-01-21 18:13:24.767 183284 DEBUG oslo_concurrency.lockutils [req-96939c01-59ad-46ca-a177-ea3ff527fe27 req-d8879ad3-835b-498f-9d2e-ea592166f8d5 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Releasing lock "refresh_cache-841e0bef-3987-412a-805b-b71e87fa2a74" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 18:13:25 compute-0 nova_compute[183278]: 2026-01-21 18:13:25.558 183284 DEBUG nova.compute.manager [req-f885b296-f7c8-4ec7-8e35-3d47239e4f01 req-036e1f0b-594d-4cdf-8989-533a2234cc78 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] Received event network-vif-plugged-fa0544b4-ca76-47d6-a911-a353d9e6095f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:13:25 compute-0 nova_compute[183278]: 2026-01-21 18:13:25.559 183284 DEBUG oslo_concurrency.lockutils [req-f885b296-f7c8-4ec7-8e35-3d47239e4f01 req-036e1f0b-594d-4cdf-8989-533a2234cc78 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "841e0bef-3987-412a-805b-b71e87fa2a74-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:13:25 compute-0 nova_compute[183278]: 2026-01-21 18:13:25.559 183284 DEBUG oslo_concurrency.lockutils [req-f885b296-f7c8-4ec7-8e35-3d47239e4f01 req-036e1f0b-594d-4cdf-8989-533a2234cc78 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "841e0bef-3987-412a-805b-b71e87fa2a74-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:13:25 compute-0 nova_compute[183278]: 2026-01-21 18:13:25.559 183284 DEBUG oslo_concurrency.lockutils [req-f885b296-f7c8-4ec7-8e35-3d47239e4f01 req-036e1f0b-594d-4cdf-8989-533a2234cc78 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "841e0bef-3987-412a-805b-b71e87fa2a74-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:13:25 compute-0 nova_compute[183278]: 2026-01-21 18:13:25.559 183284 DEBUG nova.compute.manager [req-f885b296-f7c8-4ec7-8e35-3d47239e4f01 req-036e1f0b-594d-4cdf-8989-533a2234cc78 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] No waiting events found dispatching network-vif-plugged-fa0544b4-ca76-47d6-a911-a353d9e6095f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:13:25 compute-0 nova_compute[183278]: 2026-01-21 18:13:25.559 183284 WARNING nova.compute.manager [req-f885b296-f7c8-4ec7-8e35-3d47239e4f01 req-036e1f0b-594d-4cdf-8989-533a2234cc78 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] Received unexpected event network-vif-plugged-fa0544b4-ca76-47d6-a911-a353d9e6095f for instance with vm_state active and task_state resize_finish.
Jan 21 18:13:27 compute-0 nova_compute[183278]: 2026-01-21 18:13:27.639 183284 DEBUG nova.compute.manager [req-675a71a6-1a97-475a-b1bf-f11f2cc57aea req-f1f3651d-9931-4749-b7bf-df64c3b5c97a 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] Received event network-vif-plugged-fa0544b4-ca76-47d6-a911-a353d9e6095f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:13:27 compute-0 nova_compute[183278]: 2026-01-21 18:13:27.640 183284 DEBUG oslo_concurrency.lockutils [req-675a71a6-1a97-475a-b1bf-f11f2cc57aea req-f1f3651d-9931-4749-b7bf-df64c3b5c97a 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "841e0bef-3987-412a-805b-b71e87fa2a74-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:13:27 compute-0 nova_compute[183278]: 2026-01-21 18:13:27.640 183284 DEBUG oslo_concurrency.lockutils [req-675a71a6-1a97-475a-b1bf-f11f2cc57aea req-f1f3651d-9931-4749-b7bf-df64c3b5c97a 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "841e0bef-3987-412a-805b-b71e87fa2a74-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:13:27 compute-0 nova_compute[183278]: 2026-01-21 18:13:27.640 183284 DEBUG oslo_concurrency.lockutils [req-675a71a6-1a97-475a-b1bf-f11f2cc57aea req-f1f3651d-9931-4749-b7bf-df64c3b5c97a 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "841e0bef-3987-412a-805b-b71e87fa2a74-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:13:27 compute-0 nova_compute[183278]: 2026-01-21 18:13:27.640 183284 DEBUG nova.compute.manager [req-675a71a6-1a97-475a-b1bf-f11f2cc57aea req-f1f3651d-9931-4749-b7bf-df64c3b5c97a 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] No waiting events found dispatching network-vif-plugged-fa0544b4-ca76-47d6-a911-a353d9e6095f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:13:27 compute-0 nova_compute[183278]: 2026-01-21 18:13:27.641 183284 WARNING nova.compute.manager [req-675a71a6-1a97-475a-b1bf-f11f2cc57aea req-f1f3651d-9931-4749-b7bf-df64c3b5c97a 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] Received unexpected event network-vif-plugged-fa0544b4-ca76-47d6-a911-a353d9e6095f for instance with vm_state resized and task_state None.
Jan 21 18:13:28 compute-0 nova_compute[183278]: 2026-01-21 18:13:28.680 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:13:29 compute-0 nova_compute[183278]: 2026-01-21 18:13:29.115 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:13:29 compute-0 nova_compute[183278]: 2026-01-21 18:13:29.709 183284 DEBUG oslo_concurrency.lockutils [None req-2f51a43a-86de-4e62-8c07-ab7ef8e4b0b4 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "841e0bef-3987-412a-805b-b71e87fa2a74" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:13:29 compute-0 nova_compute[183278]: 2026-01-21 18:13:29.709 183284 DEBUG oslo_concurrency.lockutils [None req-2f51a43a-86de-4e62-8c07-ab7ef8e4b0b4 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lock "841e0bef-3987-412a-805b-b71e87fa2a74" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:13:29 compute-0 nova_compute[183278]: 2026-01-21 18:13:29.709 183284 DEBUG nova.compute.manager [None req-2f51a43a-86de-4e62-8c07-ab7ef8e4b0b4 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] Going to confirm migration 1 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679
Jan 21 18:13:29 compute-0 podman[192560]: time="2026-01-21T18:13:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 18:13:29 compute-0 podman[192560]: @ - - [21/Jan/2026:18:13:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15354 "" "Go-http-client/1.1"
Jan 21 18:13:29 compute-0 podman[192560]: @ - - [21/Jan/2026:18:13:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2168 "" "Go-http-client/1.1"
Jan 21 18:13:30 compute-0 nova_compute[183278]: 2026-01-21 18:13:30.562 183284 DEBUG neutronclient.v2_0.client [None req-2f51a43a-86de-4e62-8c07-ab7ef8e4b0b4 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port fa0544b4-ca76-47d6-a911-a353d9e6095f for host compute-0.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262
Jan 21 18:13:30 compute-0 nova_compute[183278]: 2026-01-21 18:13:30.563 183284 DEBUG oslo_concurrency.lockutils [None req-2f51a43a-86de-4e62-8c07-ab7ef8e4b0b4 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "refresh_cache-841e0bef-3987-412a-805b-b71e87fa2a74" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:13:30 compute-0 nova_compute[183278]: 2026-01-21 18:13:30.563 183284 DEBUG oslo_concurrency.lockutils [None req-2f51a43a-86de-4e62-8c07-ab7ef8e4b0b4 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Acquired lock "refresh_cache-841e0bef-3987-412a-805b-b71e87fa2a74" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 18:13:30 compute-0 nova_compute[183278]: 2026-01-21 18:13:30.563 183284 DEBUG nova.network.neutron [None req-2f51a43a-86de-4e62-8c07-ab7ef8e4b0b4 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 18:13:30 compute-0 nova_compute[183278]: 2026-01-21 18:13:30.563 183284 DEBUG nova.objects.instance [None req-2f51a43a-86de-4e62-8c07-ab7ef8e4b0b4 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lazy-loading 'info_cache' on Instance uuid 841e0bef-3987-412a-805b-b71e87fa2a74 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:13:31 compute-0 openstack_network_exporter[195402]: ERROR   18:13:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 21 18:13:31 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:13:31 compute-0 openstack_network_exporter[195402]: ERROR   18:13:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 21 18:13:31 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:13:31 compute-0 nova_compute[183278]: 2026-01-21 18:13:31.848 183284 DEBUG nova.network.neutron [None req-2f51a43a-86de-4e62-8c07-ab7ef8e4b0b4 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] Updating instance_info_cache with network_info: [{"id": "fa0544b4-ca76-47d6-a911-a353d9e6095f", "address": "fa:16:3e:ac:65:b9", "network": {"id": "e8161199-7513-4099-89c4-00e7e075c92b", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-514974975-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2a4b7cdf556d4f8393d1c61b57628813", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa0544b4-ca", "ovs_interfaceid": "fa0544b4-ca76-47d6-a911-a353d9e6095f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:13:31 compute-0 nova_compute[183278]: 2026-01-21 18:13:31.868 183284 DEBUG oslo_concurrency.lockutils [None req-2f51a43a-86de-4e62-8c07-ab7ef8e4b0b4 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Releasing lock "refresh_cache-841e0bef-3987-412a-805b-b71e87fa2a74" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 18:13:31 compute-0 nova_compute[183278]: 2026-01-21 18:13:31.868 183284 DEBUG nova.objects.instance [None req-2f51a43a-86de-4e62-8c07-ab7ef8e4b0b4 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lazy-loading 'migration_context' on Instance uuid 841e0bef-3987-412a-805b-b71e87fa2a74 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:13:31 compute-0 nova_compute[183278]: 2026-01-21 18:13:31.893 183284 DEBUG nova.virt.libvirt.host [None req-2f51a43a-86de-4e62-8c07-ab7ef8e4b0b4 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754
Jan 21 18:13:31 compute-0 nova_compute[183278]: 2026-01-21 18:13:31.894 183284 INFO nova.virt.libvirt.host [None req-2f51a43a-86de-4e62-8c07-ab7ef8e4b0b4 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] UEFI support detected
Jan 21 18:13:31 compute-0 nova_compute[183278]: 2026-01-21 18:13:31.896 183284 DEBUG nova.virt.libvirt.vif [None req-2f51a43a-86de-4e62-8c07-ab7ef8e4b0b4 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T18:12:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-215034302',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-215034302',id=1,image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T18:13:26Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2a4b7cdf556d4f8393d1c61b57628813',ramdisk_id='',reservation_id='r-gzrkbma3',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestExecuteActionsViaActuator-627352265',owner_user_name='tempest-TestExecuteActionsViaActuator-627352265-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T18:13:26Z,user_data=None,user_id='16f8ab2ae83b48f9a88753a5deddcc19',uuid=841e0bef-3987-412a-805b-b71e87fa2a74,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "fa0544b4-ca76-47d6-a911-a353d9e6095f", "address": "fa:16:3e:ac:65:b9", "network": {"id": "e8161199-7513-4099-89c4-00e7e075c92b", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-514974975-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2a4b7cdf556d4f8393d1c61b57628813", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa0544b4-ca", "ovs_interfaceid": "fa0544b4-ca76-47d6-a911-a353d9e6095f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 18:13:31 compute-0 nova_compute[183278]: 2026-01-21 18:13:31.896 183284 DEBUG nova.network.os_vif_util [None req-2f51a43a-86de-4e62-8c07-ab7ef8e4b0b4 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Converting VIF {"id": "fa0544b4-ca76-47d6-a911-a353d9e6095f", "address": "fa:16:3e:ac:65:b9", "network": {"id": "e8161199-7513-4099-89c4-00e7e075c92b", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-514974975-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2a4b7cdf556d4f8393d1c61b57628813", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa0544b4-ca", "ovs_interfaceid": "fa0544b4-ca76-47d6-a911-a353d9e6095f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 18:13:31 compute-0 nova_compute[183278]: 2026-01-21 18:13:31.897 183284 DEBUG nova.network.os_vif_util [None req-2f51a43a-86de-4e62-8c07-ab7ef8e4b0b4 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ac:65:b9,bridge_name='br-int',has_traffic_filtering=True,id=fa0544b4-ca76-47d6-a911-a353d9e6095f,network=Network(e8161199-7513-4099-89c4-00e7e075c92b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfa0544b4-ca') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 18:13:31 compute-0 nova_compute[183278]: 2026-01-21 18:13:31.897 183284 DEBUG os_vif [None req-2f51a43a-86de-4e62-8c07-ab7ef8e4b0b4 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ac:65:b9,bridge_name='br-int',has_traffic_filtering=True,id=fa0544b4-ca76-47d6-a911-a353d9e6095f,network=Network(e8161199-7513-4099-89c4-00e7e075c92b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfa0544b4-ca') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 21 18:13:31 compute-0 nova_compute[183278]: 2026-01-21 18:13:31.899 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:13:31 compute-0 nova_compute[183278]: 2026-01-21 18:13:31.899 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfa0544b4-ca, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:13:31 compute-0 nova_compute[183278]: 2026-01-21 18:13:31.899 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 18:13:31 compute-0 nova_compute[183278]: 2026-01-21 18:13:31.901 183284 INFO os_vif [None req-2f51a43a-86de-4e62-8c07-ab7ef8e4b0b4 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ac:65:b9,bridge_name='br-int',has_traffic_filtering=True,id=fa0544b4-ca76-47d6-a911-a353d9e6095f,network=Network(e8161199-7513-4099-89c4-00e7e075c92b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfa0544b4-ca')
Jan 21 18:13:31 compute-0 nova_compute[183278]: 2026-01-21 18:13:31.902 183284 DEBUG oslo_concurrency.lockutils [None req-2f51a43a-86de-4e62-8c07-ab7ef8e4b0b4 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:13:31 compute-0 nova_compute[183278]: 2026-01-21 18:13:31.902 183284 DEBUG oslo_concurrency.lockutils [None req-2f51a43a-86de-4e62-8c07-ab7ef8e4b0b4 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:13:31 compute-0 nova_compute[183278]: 2026-01-21 18:13:31.983 183284 DEBUG nova.compute.provider_tree [None req-2f51a43a-86de-4e62-8c07-ab7ef8e4b0b4 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Inventory has not changed in ProviderTree for provider: 502e4243-611b-433d-a766-9b485d51652d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 18:13:31 compute-0 podman[204226]: 2026-01-21 18:13:31.989041656 +0000 UTC m=+0.051111685 container health_status 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, version=9.6, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, container_name=openstack_network_exporter, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, name=ubi9-minimal, vcs-type=git, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers)
Jan 21 18:13:32 compute-0 nova_compute[183278]: 2026-01-21 18:13:31.998 183284 DEBUG nova.scheduler.client.report [None req-2f51a43a-86de-4e62-8c07-ab7ef8e4b0b4 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Inventory has not changed for provider 502e4243-611b-433d-a766-9b485d51652d based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 18:13:32 compute-0 nova_compute[183278]: 2026-01-21 18:13:32.111 183284 DEBUG oslo_concurrency.lockutils [None req-2f51a43a-86de-4e62-8c07-ab7ef8e4b0b4 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 0.209s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:13:32 compute-0 nova_compute[183278]: 2026-01-21 18:13:32.284 183284 INFO nova.scheduler.client.report [None req-2f51a43a-86de-4e62-8c07-ab7ef8e4b0b4 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Deleted allocation for migration 2e557c2d-1c4a-4668-8ffb-7bff6d6d74bb
Jan 21 18:13:32 compute-0 nova_compute[183278]: 2026-01-21 18:13:32.346 183284 DEBUG oslo_concurrency.lockutils [None req-2f51a43a-86de-4e62-8c07-ab7ef8e4b0b4 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lock "841e0bef-3987-412a-805b-b71e87fa2a74" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 2.637s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:13:33 compute-0 nova_compute[183278]: 2026-01-21 18:13:33.440 183284 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769019198.438881, 841e0bef-3987-412a-805b-b71e87fa2a74 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:13:33 compute-0 nova_compute[183278]: 2026-01-21 18:13:33.441 183284 INFO nova.compute.manager [-] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] VM Stopped (Lifecycle Event)
Jan 21 18:13:33 compute-0 nova_compute[183278]: 2026-01-21 18:13:33.460 183284 DEBUG nova.compute.manager [None req-db8fbc84-463e-41a4-a729-66caba16aee9 - - - - - -] [instance: 841e0bef-3987-412a-805b-b71e87fa2a74] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:13:33 compute-0 nova_compute[183278]: 2026-01-21 18:13:33.683 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:13:34 compute-0 nova_compute[183278]: 2026-01-21 18:13:34.116 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:13:34 compute-0 sshd-session[204248]: Invalid user ansible_user from 64.227.98.100 port 53324
Jan 21 18:13:34 compute-0 sshd-session[204248]: Connection closed by invalid user ansible_user 64.227.98.100 port 53324 [preauth]
Jan 21 18:13:35 compute-0 nova_compute[183278]: 2026-01-21 18:13:35.787 183284 DEBUG oslo_concurrency.lockutils [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Acquiring lock "b499883a-ee9f-4239-b996-4fbaa175bcc3" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:13:35 compute-0 nova_compute[183278]: 2026-01-21 18:13:35.787 183284 DEBUG oslo_concurrency.lockutils [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Lock "b499883a-ee9f-4239-b996-4fbaa175bcc3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:13:35 compute-0 nova_compute[183278]: 2026-01-21 18:13:35.826 183284 DEBUG nova.compute.manager [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 21 18:13:35 compute-0 nova_compute[183278]: 2026-01-21 18:13:35.888 183284 DEBUG oslo_concurrency.lockutils [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:13:35 compute-0 nova_compute[183278]: 2026-01-21 18:13:35.889 183284 DEBUG oslo_concurrency.lockutils [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:13:35 compute-0 nova_compute[183278]: 2026-01-21 18:13:35.897 183284 DEBUG nova.virt.hardware [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 18:13:35 compute-0 nova_compute[183278]: 2026-01-21 18:13:35.898 183284 INFO nova.compute.claims [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Claim successful on node compute-0.ctlplane.example.com
Jan 21 18:13:35 compute-0 nova_compute[183278]: 2026-01-21 18:13:35.989 183284 DEBUG nova.compute.provider_tree [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Inventory has not changed in ProviderTree for provider: 502e4243-611b-433d-a766-9b485d51652d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 18:13:36 compute-0 nova_compute[183278]: 2026-01-21 18:13:36.008 183284 DEBUG nova.scheduler.client.report [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Inventory has not changed for provider 502e4243-611b-433d-a766-9b485d51652d based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 18:13:36 compute-0 nova_compute[183278]: 2026-01-21 18:13:36.033 183284 DEBUG oslo_concurrency.lockutils [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.144s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:13:36 compute-0 nova_compute[183278]: 2026-01-21 18:13:36.034 183284 DEBUG nova.compute.manager [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 21 18:13:36 compute-0 nova_compute[183278]: 2026-01-21 18:13:36.083 183284 DEBUG nova.compute.manager [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 21 18:13:36 compute-0 nova_compute[183278]: 2026-01-21 18:13:36.083 183284 DEBUG nova.network.neutron [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 21 18:13:36 compute-0 nova_compute[183278]: 2026-01-21 18:13:36.121 183284 INFO nova.virt.libvirt.driver [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 21 18:13:36 compute-0 nova_compute[183278]: 2026-01-21 18:13:36.137 183284 DEBUG nova.compute.manager [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 21 18:13:36 compute-0 nova_compute[183278]: 2026-01-21 18:13:36.212 183284 DEBUG nova.compute.manager [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 21 18:13:36 compute-0 nova_compute[183278]: 2026-01-21 18:13:36.213 183284 DEBUG nova.virt.libvirt.driver [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 18:13:36 compute-0 nova_compute[183278]: 2026-01-21 18:13:36.213 183284 INFO nova.virt.libvirt.driver [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Creating image(s)
Jan 21 18:13:36 compute-0 nova_compute[183278]: 2026-01-21 18:13:36.214 183284 DEBUG oslo_concurrency.lockutils [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Acquiring lock "/var/lib/nova/instances/b499883a-ee9f-4239-b996-4fbaa175bcc3/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:13:36 compute-0 nova_compute[183278]: 2026-01-21 18:13:36.214 183284 DEBUG oslo_concurrency.lockutils [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Lock "/var/lib/nova/instances/b499883a-ee9f-4239-b996-4fbaa175bcc3/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:13:36 compute-0 nova_compute[183278]: 2026-01-21 18:13:36.215 183284 DEBUG oslo_concurrency.lockutils [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Lock "/var/lib/nova/instances/b499883a-ee9f-4239-b996-4fbaa175bcc3/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:13:36 compute-0 nova_compute[183278]: 2026-01-21 18:13:36.226 183284 DEBUG oslo_concurrency.processutils [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:13:36 compute-0 nova_compute[183278]: 2026-01-21 18:13:36.280 183284 DEBUG oslo_concurrency.processutils [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:13:36 compute-0 nova_compute[183278]: 2026-01-21 18:13:36.280 183284 DEBUG oslo_concurrency.lockutils [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Acquiring lock "8bf45782a806e8f2f684ae874be9ab99d891a685" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:13:36 compute-0 nova_compute[183278]: 2026-01-21 18:13:36.281 183284 DEBUG oslo_concurrency.lockutils [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Lock "8bf45782a806e8f2f684ae874be9ab99d891a685" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:13:36 compute-0 nova_compute[183278]: 2026-01-21 18:13:36.293 183284 DEBUG oslo_concurrency.processutils [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:13:36 compute-0 nova_compute[183278]: 2026-01-21 18:13:36.349 183284 DEBUG oslo_concurrency.processutils [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:13:36 compute-0 nova_compute[183278]: 2026-01-21 18:13:36.351 183284 DEBUG oslo_concurrency.processutils [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685,backing_fmt=raw /var/lib/nova/instances/b499883a-ee9f-4239-b996-4fbaa175bcc3/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:13:36 compute-0 nova_compute[183278]: 2026-01-21 18:13:36.384 183284 DEBUG oslo_concurrency.processutils [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685,backing_fmt=raw /var/lib/nova/instances/b499883a-ee9f-4239-b996-4fbaa175bcc3/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:13:36 compute-0 nova_compute[183278]: 2026-01-21 18:13:36.385 183284 DEBUG oslo_concurrency.lockutils [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Lock "8bf45782a806e8f2f684ae874be9ab99d891a685" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.104s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:13:36 compute-0 nova_compute[183278]: 2026-01-21 18:13:36.386 183284 DEBUG oslo_concurrency.processutils [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:13:36 compute-0 nova_compute[183278]: 2026-01-21 18:13:36.444 183284 DEBUG nova.policy [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '16f8ab2ae83b48f9a88753a5deddcc19', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2a4b7cdf556d4f8393d1c61b57628813', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 21 18:13:36 compute-0 nova_compute[183278]: 2026-01-21 18:13:36.446 183284 DEBUG oslo_concurrency.processutils [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:13:36 compute-0 nova_compute[183278]: 2026-01-21 18:13:36.447 183284 DEBUG nova.virt.disk.api [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Checking if we can resize image /var/lib/nova/instances/b499883a-ee9f-4239-b996-4fbaa175bcc3/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 18:13:36 compute-0 nova_compute[183278]: 2026-01-21 18:13:36.448 183284 DEBUG oslo_concurrency.processutils [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b499883a-ee9f-4239-b996-4fbaa175bcc3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:13:36 compute-0 nova_compute[183278]: 2026-01-21 18:13:36.505 183284 DEBUG oslo_concurrency.processutils [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b499883a-ee9f-4239-b996-4fbaa175bcc3/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:13:36 compute-0 nova_compute[183278]: 2026-01-21 18:13:36.506 183284 DEBUG nova.virt.disk.api [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Cannot resize image /var/lib/nova/instances/b499883a-ee9f-4239-b996-4fbaa175bcc3/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 18:13:36 compute-0 nova_compute[183278]: 2026-01-21 18:13:36.507 183284 DEBUG nova.objects.instance [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Lazy-loading 'migration_context' on Instance uuid b499883a-ee9f-4239-b996-4fbaa175bcc3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:13:36 compute-0 nova_compute[183278]: 2026-01-21 18:13:36.531 183284 DEBUG nova.virt.libvirt.driver [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 18:13:36 compute-0 nova_compute[183278]: 2026-01-21 18:13:36.531 183284 DEBUG nova.virt.libvirt.driver [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Ensure instance console log exists: /var/lib/nova/instances/b499883a-ee9f-4239-b996-4fbaa175bcc3/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 18:13:36 compute-0 nova_compute[183278]: 2026-01-21 18:13:36.532 183284 DEBUG oslo_concurrency.lockutils [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:13:36 compute-0 nova_compute[183278]: 2026-01-21 18:13:36.532 183284 DEBUG oslo_concurrency.lockutils [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:13:36 compute-0 nova_compute[183278]: 2026-01-21 18:13:36.532 183284 DEBUG oslo_concurrency.lockutils [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:13:36 compute-0 nova_compute[183278]: 2026-01-21 18:13:36.940 183284 DEBUG nova.network.neutron [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Successfully created port: 46d8fa01-a7bd-4849-904d-01dc9c7071a4 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 21 18:13:37 compute-0 nova_compute[183278]: 2026-01-21 18:13:37.617 183284 DEBUG nova.network.neutron [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Successfully updated port: 46d8fa01-a7bd-4849-904d-01dc9c7071a4 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 21 18:13:37 compute-0 nova_compute[183278]: 2026-01-21 18:13:37.636 183284 DEBUG oslo_concurrency.lockutils [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Acquiring lock "refresh_cache-b499883a-ee9f-4239-b996-4fbaa175bcc3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:13:37 compute-0 nova_compute[183278]: 2026-01-21 18:13:37.636 183284 DEBUG oslo_concurrency.lockutils [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Acquired lock "refresh_cache-b499883a-ee9f-4239-b996-4fbaa175bcc3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 18:13:37 compute-0 nova_compute[183278]: 2026-01-21 18:13:37.636 183284 DEBUG nova.network.neutron [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 18:13:37 compute-0 nova_compute[183278]: 2026-01-21 18:13:37.719 183284 DEBUG nova.compute.manager [req-2bdf6089-d3ae-408a-ad9f-fb86d662199b req-6f1f364f-0a24-4a3e-a78b-3a9353f40b79 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Received event network-changed-46d8fa01-a7bd-4849-904d-01dc9c7071a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:13:37 compute-0 nova_compute[183278]: 2026-01-21 18:13:37.719 183284 DEBUG nova.compute.manager [req-2bdf6089-d3ae-408a-ad9f-fb86d662199b req-6f1f364f-0a24-4a3e-a78b-3a9353f40b79 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Refreshing instance network info cache due to event network-changed-46d8fa01-a7bd-4849-904d-01dc9c7071a4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 18:13:37 compute-0 nova_compute[183278]: 2026-01-21 18:13:37.720 183284 DEBUG oslo_concurrency.lockutils [req-2bdf6089-d3ae-408a-ad9f-fb86d662199b req-6f1f364f-0a24-4a3e-a78b-3a9353f40b79 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "refresh_cache-b499883a-ee9f-4239-b996-4fbaa175bcc3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:13:37 compute-0 nova_compute[183278]: 2026-01-21 18:13:37.892 183284 DEBUG nova.network.neutron [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 18:13:38 compute-0 nova_compute[183278]: 2026-01-21 18:13:38.558 183284 DEBUG nova.network.neutron [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Updating instance_info_cache with network_info: [{"id": "46d8fa01-a7bd-4849-904d-01dc9c7071a4", "address": "fa:16:3e:39:dd:15", "network": {"id": "e8161199-7513-4099-89c4-00e7e075c92b", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-514974975-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2a4b7cdf556d4f8393d1c61b57628813", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap46d8fa01-a7", "ovs_interfaceid": "46d8fa01-a7bd-4849-904d-01dc9c7071a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:13:38 compute-0 nova_compute[183278]: 2026-01-21 18:13:38.686 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:13:38 compute-0 nova_compute[183278]: 2026-01-21 18:13:38.823 183284 DEBUG oslo_concurrency.lockutils [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Releasing lock "refresh_cache-b499883a-ee9f-4239-b996-4fbaa175bcc3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 18:13:38 compute-0 nova_compute[183278]: 2026-01-21 18:13:38.823 183284 DEBUG nova.compute.manager [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Instance network_info: |[{"id": "46d8fa01-a7bd-4849-904d-01dc9c7071a4", "address": "fa:16:3e:39:dd:15", "network": {"id": "e8161199-7513-4099-89c4-00e7e075c92b", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-514974975-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2a4b7cdf556d4f8393d1c61b57628813", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap46d8fa01-a7", "ovs_interfaceid": "46d8fa01-a7bd-4849-904d-01dc9c7071a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 21 18:13:38 compute-0 nova_compute[183278]: 2026-01-21 18:13:38.824 183284 DEBUG oslo_concurrency.lockutils [req-2bdf6089-d3ae-408a-ad9f-fb86d662199b req-6f1f364f-0a24-4a3e-a78b-3a9353f40b79 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquired lock "refresh_cache-b499883a-ee9f-4239-b996-4fbaa175bcc3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 18:13:38 compute-0 nova_compute[183278]: 2026-01-21 18:13:38.824 183284 DEBUG nova.network.neutron [req-2bdf6089-d3ae-408a-ad9f-fb86d662199b req-6f1f364f-0a24-4a3e-a78b-3a9353f40b79 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Refreshing network info cache for port 46d8fa01-a7bd-4849-904d-01dc9c7071a4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 18:13:38 compute-0 nova_compute[183278]: 2026-01-21 18:13:38.826 183284 DEBUG nova.virt.libvirt.driver [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Start _get_guest_xml network_info=[{"id": "46d8fa01-a7bd-4849-904d-01dc9c7071a4", "address": "fa:16:3e:39:dd:15", "network": {"id": "e8161199-7513-4099-89c4-00e7e075c92b", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-514974975-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2a4b7cdf556d4f8393d1c61b57628813", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap46d8fa01-a7", "ovs_interfaceid": "46d8fa01-a7bd-4849-904d-01dc9c7071a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T18:09:31Z,direct_url=<?>,disk_format='qcow2',id=672306ae-5521-4fc1-a825-a16d6d125c61,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='05bd8d3790e24414a90ecf55ee1939e1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T18:09:33Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_type': 'disk', 'encryption_options': None, 'boot_index': 0, 'encryption_secret_uuid': None, 'encryption_format': None, 'image_id': '672306ae-5521-4fc1-a825-a16d6d125c61'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 18:13:38 compute-0 nova_compute[183278]: 2026-01-21 18:13:38.830 183284 WARNING nova.virt.libvirt.driver [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 18:13:38 compute-0 nova_compute[183278]: 2026-01-21 18:13:38.834 183284 DEBUG nova.virt.libvirt.host [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 18:13:38 compute-0 nova_compute[183278]: 2026-01-21 18:13:38.834 183284 DEBUG nova.virt.libvirt.host [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 18:13:38 compute-0 nova_compute[183278]: 2026-01-21 18:13:38.837 183284 DEBUG nova.virt.libvirt.host [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 18:13:38 compute-0 nova_compute[183278]: 2026-01-21 18:13:38.837 183284 DEBUG nova.virt.libvirt.host [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 18:13:38 compute-0 nova_compute[183278]: 2026-01-21 18:13:38.839 183284 DEBUG nova.virt.libvirt.driver [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 18:13:38 compute-0 nova_compute[183278]: 2026-01-21 18:13:38.839 183284 DEBUG nova.virt.hardware [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T18:09:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='45095fe9-3fd5-4f1f-87b2-a2a8292135a2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T18:09:31Z,direct_url=<?>,disk_format='qcow2',id=672306ae-5521-4fc1-a825-a16d6d125c61,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='05bd8d3790e24414a90ecf55ee1939e1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T18:09:33Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 18:13:38 compute-0 nova_compute[183278]: 2026-01-21 18:13:38.839 183284 DEBUG nova.virt.hardware [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 18:13:38 compute-0 nova_compute[183278]: 2026-01-21 18:13:38.840 183284 DEBUG nova.virt.hardware [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 18:13:38 compute-0 nova_compute[183278]: 2026-01-21 18:13:38.840 183284 DEBUG nova.virt.hardware [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 18:13:38 compute-0 nova_compute[183278]: 2026-01-21 18:13:38.840 183284 DEBUG nova.virt.hardware [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 18:13:38 compute-0 nova_compute[183278]: 2026-01-21 18:13:38.840 183284 DEBUG nova.virt.hardware [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 18:13:38 compute-0 nova_compute[183278]: 2026-01-21 18:13:38.840 183284 DEBUG nova.virt.hardware [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 18:13:38 compute-0 nova_compute[183278]: 2026-01-21 18:13:38.841 183284 DEBUG nova.virt.hardware [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 18:13:38 compute-0 nova_compute[183278]: 2026-01-21 18:13:38.841 183284 DEBUG nova.virt.hardware [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 18:13:38 compute-0 nova_compute[183278]: 2026-01-21 18:13:38.841 183284 DEBUG nova.virt.hardware [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 18:13:38 compute-0 nova_compute[183278]: 2026-01-21 18:13:38.841 183284 DEBUG nova.virt.hardware [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 18:13:38 compute-0 nova_compute[183278]: 2026-01-21 18:13:38.845 183284 DEBUG nova.virt.libvirt.vif [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T18:13:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1264591926',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1264591926',id=3,image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2a4b7cdf556d4f8393d1c61b57628813',ramdisk_id='',reservation_id='r-9cw14hr2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-627352265',owner_user_name='tempest-TestExecuteActionsViaActuator-627352265-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T18:13:36Z,user_data=None,user_id='16f8ab2ae83b48f9a88753a5deddcc19',uuid=b499883a-ee9f-4239-b996-4fbaa175bcc3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "46d8fa01-a7bd-4849-904d-01dc9c7071a4", "address": "fa:16:3e:39:dd:15", "network": {"id": "e8161199-7513-4099-89c4-00e7e075c92b", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-514974975-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2a4b7cdf556d4f8393d1c61b57628813", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap46d8fa01-a7", "ovs_interfaceid": "46d8fa01-a7bd-4849-904d-01dc9c7071a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 18:13:38 compute-0 nova_compute[183278]: 2026-01-21 18:13:38.845 183284 DEBUG nova.network.os_vif_util [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Converting VIF {"id": "46d8fa01-a7bd-4849-904d-01dc9c7071a4", "address": "fa:16:3e:39:dd:15", "network": {"id": "e8161199-7513-4099-89c4-00e7e075c92b", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-514974975-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2a4b7cdf556d4f8393d1c61b57628813", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap46d8fa01-a7", "ovs_interfaceid": "46d8fa01-a7bd-4849-904d-01dc9c7071a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 18:13:38 compute-0 nova_compute[183278]: 2026-01-21 18:13:38.846 183284 DEBUG nova.network.os_vif_util [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:39:dd:15,bridge_name='br-int',has_traffic_filtering=True,id=46d8fa01-a7bd-4849-904d-01dc9c7071a4,network=Network(e8161199-7513-4099-89c4-00e7e075c92b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap46d8fa01-a7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 18:13:38 compute-0 nova_compute[183278]: 2026-01-21 18:13:38.847 183284 DEBUG nova.objects.instance [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Lazy-loading 'pci_devices' on Instance uuid b499883a-ee9f-4239-b996-4fbaa175bcc3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:13:38 compute-0 nova_compute[183278]: 2026-01-21 18:13:38.859 183284 DEBUG nova.virt.libvirt.driver [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] End _get_guest_xml xml=<domain type="kvm">
Jan 21 18:13:38 compute-0 nova_compute[183278]:   <uuid>b499883a-ee9f-4239-b996-4fbaa175bcc3</uuid>
Jan 21 18:13:38 compute-0 nova_compute[183278]:   <name>instance-00000003</name>
Jan 21 18:13:38 compute-0 nova_compute[183278]:   <memory>131072</memory>
Jan 21 18:13:38 compute-0 nova_compute[183278]:   <vcpu>1</vcpu>
Jan 21 18:13:38 compute-0 nova_compute[183278]:   <metadata>
Jan 21 18:13:38 compute-0 nova_compute[183278]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 18:13:38 compute-0 nova_compute[183278]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 18:13:38 compute-0 nova_compute[183278]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-1264591926</nova:name>
Jan 21 18:13:38 compute-0 nova_compute[183278]:       <nova:creationTime>2026-01-21 18:13:38</nova:creationTime>
Jan 21 18:13:38 compute-0 nova_compute[183278]:       <nova:flavor name="m1.nano">
Jan 21 18:13:38 compute-0 nova_compute[183278]:         <nova:memory>128</nova:memory>
Jan 21 18:13:38 compute-0 nova_compute[183278]:         <nova:disk>1</nova:disk>
Jan 21 18:13:38 compute-0 nova_compute[183278]:         <nova:swap>0</nova:swap>
Jan 21 18:13:38 compute-0 nova_compute[183278]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 18:13:38 compute-0 nova_compute[183278]:         <nova:vcpus>1</nova:vcpus>
Jan 21 18:13:38 compute-0 nova_compute[183278]:       </nova:flavor>
Jan 21 18:13:38 compute-0 nova_compute[183278]:       <nova:owner>
Jan 21 18:13:38 compute-0 nova_compute[183278]:         <nova:user uuid="16f8ab2ae83b48f9a88753a5deddcc19">tempest-TestExecuteActionsViaActuator-627352265-project-member</nova:user>
Jan 21 18:13:38 compute-0 nova_compute[183278]:         <nova:project uuid="2a4b7cdf556d4f8393d1c61b57628813">tempest-TestExecuteActionsViaActuator-627352265</nova:project>
Jan 21 18:13:38 compute-0 nova_compute[183278]:       </nova:owner>
Jan 21 18:13:38 compute-0 nova_compute[183278]:       <nova:root type="image" uuid="672306ae-5521-4fc1-a825-a16d6d125c61"/>
Jan 21 18:13:38 compute-0 nova_compute[183278]:       <nova:ports>
Jan 21 18:13:38 compute-0 nova_compute[183278]:         <nova:port uuid="46d8fa01-a7bd-4849-904d-01dc9c7071a4">
Jan 21 18:13:38 compute-0 nova_compute[183278]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 21 18:13:38 compute-0 nova_compute[183278]:         </nova:port>
Jan 21 18:13:38 compute-0 nova_compute[183278]:       </nova:ports>
Jan 21 18:13:38 compute-0 nova_compute[183278]:     </nova:instance>
Jan 21 18:13:38 compute-0 nova_compute[183278]:   </metadata>
Jan 21 18:13:38 compute-0 nova_compute[183278]:   <sysinfo type="smbios">
Jan 21 18:13:38 compute-0 nova_compute[183278]:     <system>
Jan 21 18:13:38 compute-0 nova_compute[183278]:       <entry name="manufacturer">RDO</entry>
Jan 21 18:13:38 compute-0 nova_compute[183278]:       <entry name="product">OpenStack Compute</entry>
Jan 21 18:13:38 compute-0 nova_compute[183278]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 18:13:38 compute-0 nova_compute[183278]:       <entry name="serial">b499883a-ee9f-4239-b996-4fbaa175bcc3</entry>
Jan 21 18:13:38 compute-0 nova_compute[183278]:       <entry name="uuid">b499883a-ee9f-4239-b996-4fbaa175bcc3</entry>
Jan 21 18:13:38 compute-0 nova_compute[183278]:       <entry name="family">Virtual Machine</entry>
Jan 21 18:13:38 compute-0 nova_compute[183278]:     </system>
Jan 21 18:13:38 compute-0 nova_compute[183278]:   </sysinfo>
Jan 21 18:13:38 compute-0 nova_compute[183278]:   <os>
Jan 21 18:13:38 compute-0 nova_compute[183278]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 18:13:38 compute-0 nova_compute[183278]:     <boot dev="hd"/>
Jan 21 18:13:38 compute-0 nova_compute[183278]:     <smbios mode="sysinfo"/>
Jan 21 18:13:38 compute-0 nova_compute[183278]:   </os>
Jan 21 18:13:38 compute-0 nova_compute[183278]:   <features>
Jan 21 18:13:38 compute-0 nova_compute[183278]:     <acpi/>
Jan 21 18:13:38 compute-0 nova_compute[183278]:     <apic/>
Jan 21 18:13:38 compute-0 nova_compute[183278]:     <vmcoreinfo/>
Jan 21 18:13:38 compute-0 nova_compute[183278]:   </features>
Jan 21 18:13:38 compute-0 nova_compute[183278]:   <clock offset="utc">
Jan 21 18:13:38 compute-0 nova_compute[183278]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 18:13:38 compute-0 nova_compute[183278]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 18:13:38 compute-0 nova_compute[183278]:     <timer name="hpet" present="no"/>
Jan 21 18:13:38 compute-0 nova_compute[183278]:   </clock>
Jan 21 18:13:38 compute-0 nova_compute[183278]:   <cpu mode="custom" match="exact">
Jan 21 18:13:38 compute-0 nova_compute[183278]:     <model>Nehalem</model>
Jan 21 18:13:38 compute-0 nova_compute[183278]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 18:13:38 compute-0 nova_compute[183278]:   </cpu>
Jan 21 18:13:38 compute-0 nova_compute[183278]:   <devices>
Jan 21 18:13:38 compute-0 nova_compute[183278]:     <disk type="file" device="disk">
Jan 21 18:13:38 compute-0 nova_compute[183278]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 18:13:38 compute-0 nova_compute[183278]:       <source file="/var/lib/nova/instances/b499883a-ee9f-4239-b996-4fbaa175bcc3/disk"/>
Jan 21 18:13:38 compute-0 nova_compute[183278]:       <target dev="vda" bus="virtio"/>
Jan 21 18:13:38 compute-0 nova_compute[183278]:     </disk>
Jan 21 18:13:38 compute-0 nova_compute[183278]:     <disk type="file" device="cdrom">
Jan 21 18:13:38 compute-0 nova_compute[183278]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 18:13:38 compute-0 nova_compute[183278]:       <source file="/var/lib/nova/instances/b499883a-ee9f-4239-b996-4fbaa175bcc3/disk.config"/>
Jan 21 18:13:38 compute-0 nova_compute[183278]:       <target dev="sda" bus="sata"/>
Jan 21 18:13:38 compute-0 nova_compute[183278]:     </disk>
Jan 21 18:13:38 compute-0 nova_compute[183278]:     <interface type="ethernet">
Jan 21 18:13:38 compute-0 nova_compute[183278]:       <mac address="fa:16:3e:39:dd:15"/>
Jan 21 18:13:38 compute-0 nova_compute[183278]:       <model type="virtio"/>
Jan 21 18:13:38 compute-0 nova_compute[183278]:       <driver name="vhost" rx_queue_size="512"/>
Jan 21 18:13:38 compute-0 nova_compute[183278]:       <mtu size="1442"/>
Jan 21 18:13:38 compute-0 nova_compute[183278]:       <target dev="tap46d8fa01-a7"/>
Jan 21 18:13:38 compute-0 nova_compute[183278]:     </interface>
Jan 21 18:13:38 compute-0 nova_compute[183278]:     <serial type="pty">
Jan 21 18:13:38 compute-0 nova_compute[183278]:       <log file="/var/lib/nova/instances/b499883a-ee9f-4239-b996-4fbaa175bcc3/console.log" append="off"/>
Jan 21 18:13:38 compute-0 nova_compute[183278]:     </serial>
Jan 21 18:13:38 compute-0 nova_compute[183278]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 18:13:38 compute-0 nova_compute[183278]:     <video>
Jan 21 18:13:38 compute-0 nova_compute[183278]:       <model type="virtio"/>
Jan 21 18:13:38 compute-0 nova_compute[183278]:     </video>
Jan 21 18:13:38 compute-0 nova_compute[183278]:     <input type="tablet" bus="usb"/>
Jan 21 18:13:38 compute-0 nova_compute[183278]:     <rng model="virtio">
Jan 21 18:13:38 compute-0 nova_compute[183278]:       <backend model="random">/dev/urandom</backend>
Jan 21 18:13:38 compute-0 nova_compute[183278]:     </rng>
Jan 21 18:13:38 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root"/>
Jan 21 18:13:38 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:13:38 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:13:38 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:13:38 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:13:38 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:13:38 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:13:38 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:13:38 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:13:38 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:13:38 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:13:38 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:13:38 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:13:38 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:13:38 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:13:38 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:13:38 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:13:38 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:13:38 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:13:38 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:13:38 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:13:38 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:13:38 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:13:38 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:13:38 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:13:38 compute-0 nova_compute[183278]:     <controller type="usb" index="0"/>
Jan 21 18:13:38 compute-0 nova_compute[183278]:     <memballoon model="virtio">
Jan 21 18:13:38 compute-0 nova_compute[183278]:       <stats period="10"/>
Jan 21 18:13:38 compute-0 nova_compute[183278]:     </memballoon>
Jan 21 18:13:38 compute-0 nova_compute[183278]:   </devices>
Jan 21 18:13:38 compute-0 nova_compute[183278]: </domain>
Jan 21 18:13:38 compute-0 nova_compute[183278]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 18:13:38 compute-0 nova_compute[183278]: 2026-01-21 18:13:38.860 183284 DEBUG nova.compute.manager [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Preparing to wait for external event network-vif-plugged-46d8fa01-a7bd-4849-904d-01dc9c7071a4 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 21 18:13:38 compute-0 nova_compute[183278]: 2026-01-21 18:13:38.860 183284 DEBUG oslo_concurrency.lockutils [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Acquiring lock "b499883a-ee9f-4239-b996-4fbaa175bcc3-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:13:38 compute-0 nova_compute[183278]: 2026-01-21 18:13:38.860 183284 DEBUG oslo_concurrency.lockutils [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Lock "b499883a-ee9f-4239-b996-4fbaa175bcc3-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:13:38 compute-0 nova_compute[183278]: 2026-01-21 18:13:38.861 183284 DEBUG oslo_concurrency.lockutils [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Lock "b499883a-ee9f-4239-b996-4fbaa175bcc3-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:13:38 compute-0 nova_compute[183278]: 2026-01-21 18:13:38.861 183284 DEBUG nova.virt.libvirt.vif [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T18:13:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1264591926',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1264591926',id=3,image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2a4b7cdf556d4f8393d1c61b57628813',ramdisk_id='',reservation_id='r-9cw14hr2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-627352265',owner_user_name='tempest-TestExecuteActionsViaActuator-627352265-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T18:13:36Z,user_data=None,user_id='16f8ab2ae83b48f9a88753a5deddcc19',uuid=b499883a-ee9f-4239-b996-4fbaa175bcc3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "46d8fa01-a7bd-4849-904d-01dc9c7071a4", "address": "fa:16:3e:39:dd:15", "network": {"id": "e8161199-7513-4099-89c4-00e7e075c92b", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-514974975-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2a4b7cdf556d4f8393d1c61b57628813", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap46d8fa01-a7", "ovs_interfaceid": "46d8fa01-a7bd-4849-904d-01dc9c7071a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 21 18:13:38 compute-0 nova_compute[183278]: 2026-01-21 18:13:38.861 183284 DEBUG nova.network.os_vif_util [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Converting VIF {"id": "46d8fa01-a7bd-4849-904d-01dc9c7071a4", "address": "fa:16:3e:39:dd:15", "network": {"id": "e8161199-7513-4099-89c4-00e7e075c92b", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-514974975-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2a4b7cdf556d4f8393d1c61b57628813", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap46d8fa01-a7", "ovs_interfaceid": "46d8fa01-a7bd-4849-904d-01dc9c7071a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 18:13:38 compute-0 nova_compute[183278]: 2026-01-21 18:13:38.862 183284 DEBUG nova.network.os_vif_util [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:39:dd:15,bridge_name='br-int',has_traffic_filtering=True,id=46d8fa01-a7bd-4849-904d-01dc9c7071a4,network=Network(e8161199-7513-4099-89c4-00e7e075c92b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap46d8fa01-a7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 18:13:38 compute-0 nova_compute[183278]: 2026-01-21 18:13:38.862 183284 DEBUG os_vif [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:39:dd:15,bridge_name='br-int',has_traffic_filtering=True,id=46d8fa01-a7bd-4849-904d-01dc9c7071a4,network=Network(e8161199-7513-4099-89c4-00e7e075c92b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap46d8fa01-a7') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 21 18:13:38 compute-0 nova_compute[183278]: 2026-01-21 18:13:38.863 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:13:38 compute-0 nova_compute[183278]: 2026-01-21 18:13:38.863 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:13:38 compute-0 nova_compute[183278]: 2026-01-21 18:13:38.863 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 18:13:38 compute-0 nova_compute[183278]: 2026-01-21 18:13:38.865 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:13:38 compute-0 nova_compute[183278]: 2026-01-21 18:13:38.866 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap46d8fa01-a7, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:13:38 compute-0 nova_compute[183278]: 2026-01-21 18:13:38.866 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap46d8fa01-a7, col_values=(('external_ids', {'iface-id': '46d8fa01-a7bd-4849-904d-01dc9c7071a4', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:39:dd:15', 'vm-uuid': 'b499883a-ee9f-4239-b996-4fbaa175bcc3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:13:38 compute-0 nova_compute[183278]: 2026-01-21 18:13:38.867 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:13:38 compute-0 NetworkManager[55506]: <info>  [1769019218.8684] manager: (tap46d8fa01-a7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/24)
Jan 21 18:13:38 compute-0 nova_compute[183278]: 2026-01-21 18:13:38.871 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 18:13:38 compute-0 nova_compute[183278]: 2026-01-21 18:13:38.875 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:13:38 compute-0 nova_compute[183278]: 2026-01-21 18:13:38.876 183284 INFO os_vif [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:39:dd:15,bridge_name='br-int',has_traffic_filtering=True,id=46d8fa01-a7bd-4849-904d-01dc9c7071a4,network=Network(e8161199-7513-4099-89c4-00e7e075c92b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap46d8fa01-a7')
Jan 21 18:13:38 compute-0 nova_compute[183278]: 2026-01-21 18:13:38.923 183284 DEBUG nova.virt.libvirt.driver [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 18:13:38 compute-0 nova_compute[183278]: 2026-01-21 18:13:38.923 183284 DEBUG nova.virt.libvirt.driver [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 18:13:38 compute-0 nova_compute[183278]: 2026-01-21 18:13:38.923 183284 DEBUG nova.virt.libvirt.driver [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] No VIF found with MAC fa:16:3e:39:dd:15, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 21 18:13:38 compute-0 nova_compute[183278]: 2026-01-21 18:13:38.924 183284 INFO nova.virt.libvirt.driver [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Using config drive
Jan 21 18:13:38 compute-0 podman[204269]: 2026-01-21 18:13:38.960722003 +0000 UTC m=+0.046975776 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 21 18:13:39 compute-0 podman[204268]: 2026-01-21 18:13:39.019295219 +0000 UTC m=+0.106440144 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 18:13:39 compute-0 nova_compute[183278]: 2026-01-21 18:13:39.118 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:13:39 compute-0 nova_compute[183278]: 2026-01-21 18:13:39.289 183284 INFO nova.virt.libvirt.driver [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Creating config drive at /var/lib/nova/instances/b499883a-ee9f-4239-b996-4fbaa175bcc3/disk.config
Jan 21 18:13:39 compute-0 nova_compute[183278]: 2026-01-21 18:13:39.295 183284 DEBUG oslo_concurrency.processutils [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b499883a-ee9f-4239-b996-4fbaa175bcc3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmo4gcdku execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:13:39 compute-0 nova_compute[183278]: 2026-01-21 18:13:39.421 183284 DEBUG oslo_concurrency.processutils [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b499883a-ee9f-4239-b996-4fbaa175bcc3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmo4gcdku" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:13:39 compute-0 kernel: tap46d8fa01-a7: entered promiscuous mode
Jan 21 18:13:39 compute-0 NetworkManager[55506]: <info>  [1769019219.4740] manager: (tap46d8fa01-a7): new Tun device (/org/freedesktop/NetworkManager/Devices/25)
Jan 21 18:13:39 compute-0 ovn_controller[95419]: 2026-01-21T18:13:39Z|00035|binding|INFO|Claiming lport 46d8fa01-a7bd-4849-904d-01dc9c7071a4 for this chassis.
Jan 21 18:13:39 compute-0 ovn_controller[95419]: 2026-01-21T18:13:39Z|00036|binding|INFO|46d8fa01-a7bd-4849-904d-01dc9c7071a4: Claiming fa:16:3e:39:dd:15 10.100.0.12
Jan 21 18:13:39 compute-0 nova_compute[183278]: 2026-01-21 18:13:39.473 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:13:39 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:13:39.482 104698 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:39:dd:15 10.100.0.12'], port_security=['fa:16:3e:39:dd:15 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'b499883a-ee9f-4239-b996-4fbaa175bcc3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e8161199-7513-4099-89c4-00e7e075c92b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a4b7cdf556d4f8393d1c61b57628813', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c5ea7560-106a-40fd-a00a-355d8be6545e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=afee9644-a390-49fb-b346-3fd1c948feef, chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>], logical_port=46d8fa01-a7bd-4849-904d-01dc9c7071a4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 18:13:39 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:13:39.483 104698 INFO neutron.agent.ovn.metadata.agent [-] Port 46d8fa01-a7bd-4849-904d-01dc9c7071a4 in datapath e8161199-7513-4099-89c4-00e7e075c92b bound to our chassis
Jan 21 18:13:39 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:13:39.484 104698 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e8161199-7513-4099-89c4-00e7e075c92b
Jan 21 18:13:39 compute-0 ovn_controller[95419]: 2026-01-21T18:13:39Z|00037|binding|INFO|Setting lport 46d8fa01-a7bd-4849-904d-01dc9c7071a4 ovn-installed in OVS
Jan 21 18:13:39 compute-0 ovn_controller[95419]: 2026-01-21T18:13:39Z|00038|binding|INFO|Setting lport 46d8fa01-a7bd-4849-904d-01dc9c7071a4 up in Southbound
Jan 21 18:13:39 compute-0 nova_compute[183278]: 2026-01-21 18:13:39.488 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:13:39 compute-0 nova_compute[183278]: 2026-01-21 18:13:39.489 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:13:39 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:13:39.496 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[cb0eee61-9f0c-4844-b906-8f2b106579f5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:13:39 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:13:39.497 104698 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tape8161199-71 in ovnmeta-e8161199-7513-4099-89c4-00e7e075c92b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 21 18:13:39 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:13:39.498 203892 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tape8161199-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 21 18:13:39 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:13:39.499 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[e8653637-9c9a-4707-a0bb-f171b24ac59e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:13:39 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:13:39.499 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[0fe43b0f-f8f6-4074-acc8-899aa30d2871]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:13:39 compute-0 systemd-udevd[204332]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 18:13:39 compute-0 systemd-machined[154592]: New machine qemu-2-instance-00000003.
Jan 21 18:13:39 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:13:39.511 105036 DEBUG oslo.privsep.daemon [-] privsep: reply[972b18b9-89ba-4e9c-8494-dae48ac3fada]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:13:39 compute-0 NetworkManager[55506]: <info>  [1769019219.5150] device (tap46d8fa01-a7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 18:13:39 compute-0 NetworkManager[55506]: <info>  [1769019219.5160] device (tap46d8fa01-a7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 18:13:39 compute-0 systemd[1]: Started Virtual Machine qemu-2-instance-00000003.
Jan 21 18:13:39 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:13:39.526 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[e9322a20-18af-4044-b1ed-6b0fe709053c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:13:39 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:13:39.551 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[16002243-2d20-4523-adfb-84ef6f4992ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:13:39 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:13:39.555 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[360b1d4a-4602-4820-bc70-8e3a0b0a3f45]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:13:39 compute-0 NetworkManager[55506]: <info>  [1769019219.5562] manager: (tape8161199-70): new Veth device (/org/freedesktop/NetworkManager/Devices/26)
Jan 21 18:13:39 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:13:39.581 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[4db30de9-d5ee-46e6-81f1-3acbedbc458c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:13:39 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:13:39.585 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[8632a780-2c35-4e30-8207-531aafbc4183]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:13:39 compute-0 NetworkManager[55506]: <info>  [1769019219.6063] device (tape8161199-70): carrier: link connected
Jan 21 18:13:39 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:13:39.611 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[7eb3410e-ca06-4e81-bf06-0048225d838f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:13:39 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:13:39.626 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[efbe2593-9fac-499b-8020-f9df8d941500]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape8161199-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bf:ce:84'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 378321, 'reachable_time': 16740, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 204364, 'error': None, 'target': 'ovnmeta-e8161199-7513-4099-89c4-00e7e075c92b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:13:39 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:13:39.641 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[353b19f7-03fa-42c3-90ab-74a1eee2121b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:febf:ce84'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 378321, 'tstamp': 378321}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 204365, 'error': None, 'target': 'ovnmeta-e8161199-7513-4099-89c4-00e7e075c92b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:13:39 compute-0 nova_compute[183278]: 2026-01-21 18:13:39.653 183284 DEBUG nova.compute.manager [req-55fb2c00-e3de-4a39-bb83-3dcecc44dd38 req-601bb226-0de1-4b14-935b-c0b14d159092 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Received event network-vif-plugged-46d8fa01-a7bd-4849-904d-01dc9c7071a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:13:39 compute-0 nova_compute[183278]: 2026-01-21 18:13:39.654 183284 DEBUG oslo_concurrency.lockutils [req-55fb2c00-e3de-4a39-bb83-3dcecc44dd38 req-601bb226-0de1-4b14-935b-c0b14d159092 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "b499883a-ee9f-4239-b996-4fbaa175bcc3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:13:39 compute-0 nova_compute[183278]: 2026-01-21 18:13:39.654 183284 DEBUG oslo_concurrency.lockutils [req-55fb2c00-e3de-4a39-bb83-3dcecc44dd38 req-601bb226-0de1-4b14-935b-c0b14d159092 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "b499883a-ee9f-4239-b996-4fbaa175bcc3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:13:39 compute-0 nova_compute[183278]: 2026-01-21 18:13:39.654 183284 DEBUG oslo_concurrency.lockutils [req-55fb2c00-e3de-4a39-bb83-3dcecc44dd38 req-601bb226-0de1-4b14-935b-c0b14d159092 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "b499883a-ee9f-4239-b996-4fbaa175bcc3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:13:39 compute-0 nova_compute[183278]: 2026-01-21 18:13:39.654 183284 DEBUG nova.compute.manager [req-55fb2c00-e3de-4a39-bb83-3dcecc44dd38 req-601bb226-0de1-4b14-935b-c0b14d159092 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Processing event network-vif-plugged-46d8fa01-a7bd-4849-904d-01dc9c7071a4 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 21 18:13:39 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:13:39.656 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[1dcc5218-2583-42e0-803b-89162be9bd3c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape8161199-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bf:ce:84'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 378321, 'reachable_time': 16740, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 204366, 'error': None, 'target': 'ovnmeta-e8161199-7513-4099-89c4-00e7e075c92b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:13:39 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:13:39.684 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[7212914e-10f0-40fe-aa7a-f5689ed9f127]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:13:39 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:13:39.737 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[2cb3ae50-ba69-4486-8121-628e084d51c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:13:39 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:13:39.738 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape8161199-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:13:39 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:13:39.738 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 18:13:39 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:13:39.739 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape8161199-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:13:39 compute-0 nova_compute[183278]: 2026-01-21 18:13:39.791 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:13:39 compute-0 NetworkManager[55506]: <info>  [1769019219.7940] manager: (tape8161199-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/27)
Jan 21 18:13:39 compute-0 kernel: tape8161199-70: entered promiscuous mode
Jan 21 18:13:39 compute-0 nova_compute[183278]: 2026-01-21 18:13:39.798 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:13:39 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:13:39.799 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape8161199-70, col_values=(('external_ids', {'iface-id': 'd5993779-4a27-48a2-a904-ec457f58cb35'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:13:39 compute-0 ovn_controller[95419]: 2026-01-21T18:13:39Z|00039|binding|INFO|Releasing lport d5993779-4a27-48a2-a904-ec457f58cb35 from this chassis (sb_readonly=0)
Jan 21 18:13:39 compute-0 nova_compute[183278]: 2026-01-21 18:13:39.802 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:13:39 compute-0 nova_compute[183278]: 2026-01-21 18:13:39.818 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:13:39 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:13:39.818 104698 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e8161199-7513-4099-89c4-00e7e075c92b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e8161199-7513-4099-89c4-00e7e075c92b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 21 18:13:39 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:13:39.819 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[a932da21-0275-4251-94ba-d953ab6ff30d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:13:39 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:13:39.820 104698 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 18:13:39 compute-0 ovn_metadata_agent[104693]: global
Jan 21 18:13:39 compute-0 ovn_metadata_agent[104693]:     log         /dev/log local0 debug
Jan 21 18:13:39 compute-0 ovn_metadata_agent[104693]:     log-tag     haproxy-metadata-proxy-e8161199-7513-4099-89c4-00e7e075c92b
Jan 21 18:13:39 compute-0 ovn_metadata_agent[104693]:     user        root
Jan 21 18:13:39 compute-0 ovn_metadata_agent[104693]:     group       root
Jan 21 18:13:39 compute-0 ovn_metadata_agent[104693]:     maxconn     1024
Jan 21 18:13:39 compute-0 ovn_metadata_agent[104693]:     pidfile     /var/lib/neutron/external/pids/e8161199-7513-4099-89c4-00e7e075c92b.pid.haproxy
Jan 21 18:13:39 compute-0 ovn_metadata_agent[104693]:     daemon
Jan 21 18:13:39 compute-0 ovn_metadata_agent[104693]: 
Jan 21 18:13:39 compute-0 ovn_metadata_agent[104693]: defaults
Jan 21 18:13:39 compute-0 ovn_metadata_agent[104693]:     log global
Jan 21 18:13:39 compute-0 ovn_metadata_agent[104693]:     mode http
Jan 21 18:13:39 compute-0 ovn_metadata_agent[104693]:     option httplog
Jan 21 18:13:39 compute-0 ovn_metadata_agent[104693]:     option dontlognull
Jan 21 18:13:39 compute-0 ovn_metadata_agent[104693]:     option http-server-close
Jan 21 18:13:39 compute-0 ovn_metadata_agent[104693]:     option forwardfor
Jan 21 18:13:39 compute-0 ovn_metadata_agent[104693]:     retries                 3
Jan 21 18:13:39 compute-0 ovn_metadata_agent[104693]:     timeout http-request    30s
Jan 21 18:13:39 compute-0 ovn_metadata_agent[104693]:     timeout connect         30s
Jan 21 18:13:39 compute-0 ovn_metadata_agent[104693]:     timeout client          32s
Jan 21 18:13:39 compute-0 ovn_metadata_agent[104693]:     timeout server          32s
Jan 21 18:13:39 compute-0 ovn_metadata_agent[104693]:     timeout http-keep-alive 30s
Jan 21 18:13:39 compute-0 ovn_metadata_agent[104693]: 
Jan 21 18:13:39 compute-0 ovn_metadata_agent[104693]: 
Jan 21 18:13:39 compute-0 ovn_metadata_agent[104693]: listen listener
Jan 21 18:13:39 compute-0 ovn_metadata_agent[104693]:     bind 169.254.169.254:80
Jan 21 18:13:39 compute-0 ovn_metadata_agent[104693]:     server metadata /var/lib/neutron/metadata_proxy
Jan 21 18:13:39 compute-0 ovn_metadata_agent[104693]:     http-request add-header X-OVN-Network-ID e8161199-7513-4099-89c4-00e7e075c92b
Jan 21 18:13:39 compute-0 ovn_metadata_agent[104693]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 21 18:13:39 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:13:39.820 104698 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-e8161199-7513-4099-89c4-00e7e075c92b', 'env', 'PROCESS_TAG=haproxy-e8161199-7513-4099-89c4-00e7e075c92b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/e8161199-7513-4099-89c4-00e7e075c92b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 21 18:13:39 compute-0 nova_compute[183278]: 2026-01-21 18:13:39.906 183284 DEBUG nova.network.neutron [req-2bdf6089-d3ae-408a-ad9f-fb86d662199b req-6f1f364f-0a24-4a3e-a78b-3a9353f40b79 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Updated VIF entry in instance network info cache for port 46d8fa01-a7bd-4849-904d-01dc9c7071a4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 18:13:39 compute-0 nova_compute[183278]: 2026-01-21 18:13:39.906 183284 DEBUG nova.network.neutron [req-2bdf6089-d3ae-408a-ad9f-fb86d662199b req-6f1f364f-0a24-4a3e-a78b-3a9353f40b79 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Updating instance_info_cache with network_info: [{"id": "46d8fa01-a7bd-4849-904d-01dc9c7071a4", "address": "fa:16:3e:39:dd:15", "network": {"id": "e8161199-7513-4099-89c4-00e7e075c92b", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-514974975-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2a4b7cdf556d4f8393d1c61b57628813", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap46d8fa01-a7", "ovs_interfaceid": "46d8fa01-a7bd-4849-904d-01dc9c7071a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:13:39 compute-0 nova_compute[183278]: 2026-01-21 18:13:39.928 183284 DEBUG oslo_concurrency.lockutils [req-2bdf6089-d3ae-408a-ad9f-fb86d662199b req-6f1f364f-0a24-4a3e-a78b-3a9353f40b79 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Releasing lock "refresh_cache-b499883a-ee9f-4239-b996-4fbaa175bcc3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 18:13:40 compute-0 nova_compute[183278]: 2026-01-21 18:13:40.140 183284 DEBUG nova.compute.manager [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 18:13:40 compute-0 nova_compute[183278]: 2026-01-21 18:13:40.141 183284 DEBUG nova.virt.driver [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Emitting event <LifecycleEvent: 1769019220.140223, b499883a-ee9f-4239-b996-4fbaa175bcc3 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:13:40 compute-0 nova_compute[183278]: 2026-01-21 18:13:40.141 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] VM Started (Lifecycle Event)
Jan 21 18:13:40 compute-0 nova_compute[183278]: 2026-01-21 18:13:40.143 183284 DEBUG nova.virt.libvirt.driver [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 18:13:40 compute-0 nova_compute[183278]: 2026-01-21 18:13:40.146 183284 INFO nova.virt.libvirt.driver [-] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Instance spawned successfully.
Jan 21 18:13:40 compute-0 nova_compute[183278]: 2026-01-21 18:13:40.146 183284 DEBUG nova.virt.libvirt.driver [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 21 18:13:40 compute-0 nova_compute[183278]: 2026-01-21 18:13:40.164 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:13:40 compute-0 nova_compute[183278]: 2026-01-21 18:13:40.169 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 18:13:40 compute-0 nova_compute[183278]: 2026-01-21 18:13:40.171 183284 DEBUG nova.virt.libvirt.driver [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:13:40 compute-0 nova_compute[183278]: 2026-01-21 18:13:40.172 183284 DEBUG nova.virt.libvirt.driver [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:13:40 compute-0 nova_compute[183278]: 2026-01-21 18:13:40.172 183284 DEBUG nova.virt.libvirt.driver [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:13:40 compute-0 nova_compute[183278]: 2026-01-21 18:13:40.172 183284 DEBUG nova.virt.libvirt.driver [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:13:40 compute-0 nova_compute[183278]: 2026-01-21 18:13:40.173 183284 DEBUG nova.virt.libvirt.driver [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:13:40 compute-0 nova_compute[183278]: 2026-01-21 18:13:40.173 183284 DEBUG nova.virt.libvirt.driver [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:13:40 compute-0 nova_compute[183278]: 2026-01-21 18:13:40.199 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 18:13:40 compute-0 nova_compute[183278]: 2026-01-21 18:13:40.199 183284 DEBUG nova.virt.driver [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Emitting event <LifecycleEvent: 1769019220.1434622, b499883a-ee9f-4239-b996-4fbaa175bcc3 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:13:40 compute-0 nova_compute[183278]: 2026-01-21 18:13:40.199 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] VM Paused (Lifecycle Event)
Jan 21 18:13:40 compute-0 podman[204404]: 2026-01-21 18:13:40.132083162 +0000 UTC m=+0.030844567 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 18:13:40 compute-0 nova_compute[183278]: 2026-01-21 18:13:40.285 183284 INFO nova.compute.manager [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Took 4.07 seconds to spawn the instance on the hypervisor.
Jan 21 18:13:40 compute-0 nova_compute[183278]: 2026-01-21 18:13:40.286 183284 DEBUG nova.compute.manager [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:13:40 compute-0 nova_compute[183278]: 2026-01-21 18:13:40.288 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:13:40 compute-0 nova_compute[183278]: 2026-01-21 18:13:40.298 183284 DEBUG nova.virt.driver [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Emitting event <LifecycleEvent: 1769019220.1442797, b499883a-ee9f-4239-b996-4fbaa175bcc3 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:13:40 compute-0 nova_compute[183278]: 2026-01-21 18:13:40.299 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] VM Resumed (Lifecycle Event)
Jan 21 18:13:40 compute-0 nova_compute[183278]: 2026-01-21 18:13:40.338 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:13:40 compute-0 nova_compute[183278]: 2026-01-21 18:13:40.340 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 18:13:40 compute-0 nova_compute[183278]: 2026-01-21 18:13:40.357 183284 INFO nova.compute.manager [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Took 4.49 seconds to build instance.
Jan 21 18:13:40 compute-0 nova_compute[183278]: 2026-01-21 18:13:40.371 183284 DEBUG oslo_concurrency.lockutils [None req-b1702a6d-cd84-4485-805b-5a6efa95b1a0 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Lock "b499883a-ee9f-4239-b996-4fbaa175bcc3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.584s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:13:40 compute-0 podman[204404]: 2026-01-21 18:13:40.521190962 +0000 UTC m=+0.419952357 container create 9c615a6ac9004548d3719addf22293ef1086c1786f379fc3cdaf795323eaa730 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e8161199-7513-4099-89c4-00e7e075c92b, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 21 18:13:40 compute-0 systemd[1]: Started libpod-conmon-9c615a6ac9004548d3719addf22293ef1086c1786f379fc3cdaf795323eaa730.scope.
Jan 21 18:13:40 compute-0 systemd[1]: Started libcrun container.
Jan 21 18:13:40 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/45ecf4c57b028f6f42299024e7748d826f023a4d0b4516648c3499cb0c3c19c6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 18:13:40 compute-0 podman[204404]: 2026-01-21 18:13:40.587595317 +0000 UTC m=+0.486356722 container init 9c615a6ac9004548d3719addf22293ef1086c1786f379fc3cdaf795323eaa730 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e8161199-7513-4099-89c4-00e7e075c92b, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202)
Jan 21 18:13:40 compute-0 podman[204404]: 2026-01-21 18:13:40.592924615 +0000 UTC m=+0.491686010 container start 9c615a6ac9004548d3719addf22293ef1086c1786f379fc3cdaf795323eaa730 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e8161199-7513-4099-89c4-00e7e075c92b, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Jan 21 18:13:40 compute-0 neutron-haproxy-ovnmeta-e8161199-7513-4099-89c4-00e7e075c92b[204420]: [NOTICE]   (204424) : New worker (204426) forked
Jan 21 18:13:40 compute-0 neutron-haproxy-ovnmeta-e8161199-7513-4099-89c4-00e7e075c92b[204420]: [NOTICE]   (204424) : Loading success.
Jan 21 18:13:41 compute-0 nova_compute[183278]: 2026-01-21 18:13:41.732 183284 DEBUG nova.compute.manager [req-bc0077c1-75af-49c3-902e-f1aaf0f82974 req-4fb57c6e-3c21-4bdb-a240-c675c92d1a1d 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Received event network-vif-plugged-46d8fa01-a7bd-4849-904d-01dc9c7071a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:13:41 compute-0 nova_compute[183278]: 2026-01-21 18:13:41.733 183284 DEBUG oslo_concurrency.lockutils [req-bc0077c1-75af-49c3-902e-f1aaf0f82974 req-4fb57c6e-3c21-4bdb-a240-c675c92d1a1d 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "b499883a-ee9f-4239-b996-4fbaa175bcc3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:13:41 compute-0 nova_compute[183278]: 2026-01-21 18:13:41.733 183284 DEBUG oslo_concurrency.lockutils [req-bc0077c1-75af-49c3-902e-f1aaf0f82974 req-4fb57c6e-3c21-4bdb-a240-c675c92d1a1d 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "b499883a-ee9f-4239-b996-4fbaa175bcc3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:13:41 compute-0 nova_compute[183278]: 2026-01-21 18:13:41.733 183284 DEBUG oslo_concurrency.lockutils [req-bc0077c1-75af-49c3-902e-f1aaf0f82974 req-4fb57c6e-3c21-4bdb-a240-c675c92d1a1d 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "b499883a-ee9f-4239-b996-4fbaa175bcc3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:13:41 compute-0 nova_compute[183278]: 2026-01-21 18:13:41.733 183284 DEBUG nova.compute.manager [req-bc0077c1-75af-49c3-902e-f1aaf0f82974 req-4fb57c6e-3c21-4bdb-a240-c675c92d1a1d 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] No waiting events found dispatching network-vif-plugged-46d8fa01-a7bd-4849-904d-01dc9c7071a4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:13:41 compute-0 nova_compute[183278]: 2026-01-21 18:13:41.733 183284 WARNING nova.compute.manager [req-bc0077c1-75af-49c3-902e-f1aaf0f82974 req-4fb57c6e-3c21-4bdb-a240-c675c92d1a1d 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Received unexpected event network-vif-plugged-46d8fa01-a7bd-4849-904d-01dc9c7071a4 for instance with vm_state active and task_state None.
Jan 21 18:13:43 compute-0 podman[204435]: 2026-01-21 18:13:43.000831527 +0000 UTC m=+0.055541142 container health_status 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 21 18:13:43 compute-0 nova_compute[183278]: 2026-01-21 18:13:43.868 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:13:44 compute-0 nova_compute[183278]: 2026-01-21 18:13:44.119 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:13:48 compute-0 nova_compute[183278]: 2026-01-21 18:13:48.870 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:13:49 compute-0 nova_compute[183278]: 2026-01-21 18:13:49.121 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:13:53 compute-0 ovn_controller[95419]: 2026-01-21T18:13:53Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:39:dd:15 10.100.0.12
Jan 21 18:13:53 compute-0 ovn_controller[95419]: 2026-01-21T18:13:53Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:39:dd:15 10.100.0.12
Jan 21 18:13:53 compute-0 nova_compute[183278]: 2026-01-21 18:13:53.873 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:13:54 compute-0 nova_compute[183278]: 2026-01-21 18:13:54.123 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:13:58 compute-0 nova_compute[183278]: 2026-01-21 18:13:58.877 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:13:59 compute-0 nova_compute[183278]: 2026-01-21 18:13:59.128 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:13:59 compute-0 podman[192560]: time="2026-01-21T18:13:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 18:13:59 compute-0 podman[192560]: @ - - [21/Jan/2026:18:13:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16587 "" "Go-http-client/1.1"
Jan 21 18:13:59 compute-0 podman[192560]: @ - - [21/Jan/2026:18:13:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2630 "" "Go-http-client/1.1"
Jan 21 18:14:01 compute-0 openstack_network_exporter[195402]: ERROR   18:14:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 21 18:14:01 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:14:01 compute-0 openstack_network_exporter[195402]: ERROR   18:14:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 21 18:14:01 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:14:02 compute-0 nova_compute[183278]: 2026-01-21 18:14:02.738 183284 DEBUG oslo_concurrency.lockutils [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Acquiring lock "b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:14:02 compute-0 nova_compute[183278]: 2026-01-21 18:14:02.738 183284 DEBUG oslo_concurrency.lockutils [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Lock "b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:14:02 compute-0 nova_compute[183278]: 2026-01-21 18:14:02.771 183284 DEBUG nova.compute.manager [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 21 18:14:02 compute-0 nova_compute[183278]: 2026-01-21 18:14:02.872 183284 DEBUG oslo_concurrency.lockutils [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:14:02 compute-0 nova_compute[183278]: 2026-01-21 18:14:02.872 183284 DEBUG oslo_concurrency.lockutils [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:14:02 compute-0 nova_compute[183278]: 2026-01-21 18:14:02.909 183284 DEBUG nova.virt.hardware [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 18:14:02 compute-0 nova_compute[183278]: 2026-01-21 18:14:02.909 183284 INFO nova.compute.claims [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9] Claim successful on node compute-0.ctlplane.example.com
Jan 21 18:14:03 compute-0 podman[204485]: 2026-01-21 18:14:02.99867378 +0000 UTC m=+0.052470544 container health_status 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, name=ubi9-minimal, distribution-scope=public, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 21 18:14:03 compute-0 nova_compute[183278]: 2026-01-21 18:14:03.112 183284 DEBUG nova.compute.provider_tree [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Inventory has not changed in ProviderTree for provider: 502e4243-611b-433d-a766-9b485d51652d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 18:14:03 compute-0 nova_compute[183278]: 2026-01-21 18:14:03.124 183284 DEBUG nova.scheduler.client.report [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Inventory has not changed for provider 502e4243-611b-433d-a766-9b485d51652d based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 18:14:03 compute-0 nova_compute[183278]: 2026-01-21 18:14:03.143 183284 DEBUG oslo_concurrency.lockutils [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.271s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:14:03 compute-0 nova_compute[183278]: 2026-01-21 18:14:03.144 183284 DEBUG nova.compute.manager [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 21 18:14:03 compute-0 nova_compute[183278]: 2026-01-21 18:14:03.184 183284 DEBUG nova.compute.manager [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 21 18:14:03 compute-0 nova_compute[183278]: 2026-01-21 18:14:03.185 183284 DEBUG nova.network.neutron [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 21 18:14:03 compute-0 nova_compute[183278]: 2026-01-21 18:14:03.204 183284 INFO nova.virt.libvirt.driver [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 21 18:14:03 compute-0 nova_compute[183278]: 2026-01-21 18:14:03.220 183284 DEBUG nova.compute.manager [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 21 18:14:03 compute-0 nova_compute[183278]: 2026-01-21 18:14:03.318 183284 DEBUG nova.compute.manager [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 21 18:14:03 compute-0 nova_compute[183278]: 2026-01-21 18:14:03.320 183284 DEBUG nova.virt.libvirt.driver [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 18:14:03 compute-0 nova_compute[183278]: 2026-01-21 18:14:03.320 183284 INFO nova.virt.libvirt.driver [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9] Creating image(s)
Jan 21 18:14:03 compute-0 nova_compute[183278]: 2026-01-21 18:14:03.320 183284 DEBUG oslo_concurrency.lockutils [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Acquiring lock "/var/lib/nova/instances/b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:14:03 compute-0 nova_compute[183278]: 2026-01-21 18:14:03.321 183284 DEBUG oslo_concurrency.lockutils [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Lock "/var/lib/nova/instances/b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:14:03 compute-0 nova_compute[183278]: 2026-01-21 18:14:03.321 183284 DEBUG oslo_concurrency.lockutils [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Lock "/var/lib/nova/instances/b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:14:03 compute-0 nova_compute[183278]: 2026-01-21 18:14:03.334 183284 DEBUG oslo_concurrency.processutils [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:14:03 compute-0 nova_compute[183278]: 2026-01-21 18:14:03.358 183284 DEBUG nova.policy [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '16f8ab2ae83b48f9a88753a5deddcc19', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2a4b7cdf556d4f8393d1c61b57628813', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 21 18:14:03 compute-0 nova_compute[183278]: 2026-01-21 18:14:03.388 183284 DEBUG oslo_concurrency.processutils [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:14:03 compute-0 nova_compute[183278]: 2026-01-21 18:14:03.389 183284 DEBUG oslo_concurrency.lockutils [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Acquiring lock "8bf45782a806e8f2f684ae874be9ab99d891a685" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:14:03 compute-0 nova_compute[183278]: 2026-01-21 18:14:03.389 183284 DEBUG oslo_concurrency.lockutils [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Lock "8bf45782a806e8f2f684ae874be9ab99d891a685" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:14:03 compute-0 nova_compute[183278]: 2026-01-21 18:14:03.400 183284 DEBUG oslo_concurrency.processutils [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:14:03 compute-0 nova_compute[183278]: 2026-01-21 18:14:03.455 183284 DEBUG oslo_concurrency.processutils [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:14:03 compute-0 nova_compute[183278]: 2026-01-21 18:14:03.456 183284 DEBUG oslo_concurrency.processutils [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685,backing_fmt=raw /var/lib/nova/instances/b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:14:03 compute-0 nova_compute[183278]: 2026-01-21 18:14:03.544 183284 DEBUG oslo_concurrency.processutils [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685,backing_fmt=raw /var/lib/nova/instances/b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9/disk 1073741824" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:14:03 compute-0 nova_compute[183278]: 2026-01-21 18:14:03.546 183284 DEBUG oslo_concurrency.lockutils [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Lock "8bf45782a806e8f2f684ae874be9ab99d891a685" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.156s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:14:03 compute-0 nova_compute[183278]: 2026-01-21 18:14:03.546 183284 DEBUG oslo_concurrency.processutils [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:14:03 compute-0 nova_compute[183278]: 2026-01-21 18:14:03.600 183284 DEBUG oslo_concurrency.processutils [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:14:03 compute-0 nova_compute[183278]: 2026-01-21 18:14:03.601 183284 DEBUG nova.virt.disk.api [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Checking if we can resize image /var/lib/nova/instances/b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 18:14:03 compute-0 nova_compute[183278]: 2026-01-21 18:14:03.602 183284 DEBUG oslo_concurrency.processutils [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:14:03 compute-0 nova_compute[183278]: 2026-01-21 18:14:03.656 183284 DEBUG oslo_concurrency.processutils [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:14:03 compute-0 nova_compute[183278]: 2026-01-21 18:14:03.657 183284 DEBUG nova.virt.disk.api [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Cannot resize image /var/lib/nova/instances/b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 18:14:03 compute-0 nova_compute[183278]: 2026-01-21 18:14:03.658 183284 DEBUG nova.objects.instance [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Lazy-loading 'migration_context' on Instance uuid b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:14:03 compute-0 nova_compute[183278]: 2026-01-21 18:14:03.670 183284 DEBUG nova.virt.libvirt.driver [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 18:14:03 compute-0 nova_compute[183278]: 2026-01-21 18:14:03.671 183284 DEBUG nova.virt.libvirt.driver [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9] Ensure instance console log exists: /var/lib/nova/instances/b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 18:14:03 compute-0 nova_compute[183278]: 2026-01-21 18:14:03.671 183284 DEBUG oslo_concurrency.lockutils [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:14:03 compute-0 nova_compute[183278]: 2026-01-21 18:14:03.672 183284 DEBUG oslo_concurrency.lockutils [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:14:03 compute-0 nova_compute[183278]: 2026-01-21 18:14:03.672 183284 DEBUG oslo_concurrency.lockutils [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:14:03 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:14:03.754 104698 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e2:aa:66', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f6:39:87:73:c8'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 18:14:03 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:14:03.755 104698 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 21 18:14:03 compute-0 nova_compute[183278]: 2026-01-21 18:14:03.781 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:14:03 compute-0 nova_compute[183278]: 2026-01-21 18:14:03.819 183284 DEBUG nova.network.neutron [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9] Successfully created port: 9cf7b098-bd13-4dfa-8995-578571b027a3 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 21 18:14:03 compute-0 nova_compute[183278]: 2026-01-21 18:14:03.880 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:14:04 compute-0 nova_compute[183278]: 2026-01-21 18:14:04.130 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:14:04 compute-0 nova_compute[183278]: 2026-01-21 18:14:04.484 183284 DEBUG nova.network.neutron [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9] Successfully updated port: 9cf7b098-bd13-4dfa-8995-578571b027a3 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 21 18:14:04 compute-0 nova_compute[183278]: 2026-01-21 18:14:04.508 183284 DEBUG oslo_concurrency.lockutils [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Acquiring lock "refresh_cache-b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:14:04 compute-0 nova_compute[183278]: 2026-01-21 18:14:04.508 183284 DEBUG oslo_concurrency.lockutils [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Acquired lock "refresh_cache-b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 18:14:04 compute-0 nova_compute[183278]: 2026-01-21 18:14:04.508 183284 DEBUG nova.network.neutron [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 18:14:04 compute-0 nova_compute[183278]: 2026-01-21 18:14:04.562 183284 DEBUG nova.compute.manager [req-ea67685f-3548-41cb-ab30-14dec8c659c0 req-14999162-3f86-43ed-8843-e21dda13ad61 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9] Received event network-changed-9cf7b098-bd13-4dfa-8995-578571b027a3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:14:04 compute-0 nova_compute[183278]: 2026-01-21 18:14:04.563 183284 DEBUG nova.compute.manager [req-ea67685f-3548-41cb-ab30-14dec8c659c0 req-14999162-3f86-43ed-8843-e21dda13ad61 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9] Refreshing instance network info cache due to event network-changed-9cf7b098-bd13-4dfa-8995-578571b027a3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 18:14:04 compute-0 nova_compute[183278]: 2026-01-21 18:14:04.563 183284 DEBUG oslo_concurrency.lockutils [req-ea67685f-3548-41cb-ab30-14dec8c659c0 req-14999162-3f86-43ed-8843-e21dda13ad61 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "refresh_cache-b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:14:04 compute-0 nova_compute[183278]: 2026-01-21 18:14:04.626 183284 DEBUG nova.network.neutron [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 18:14:04 compute-0 nova_compute[183278]: 2026-01-21 18:14:04.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:14:04 compute-0 nova_compute[183278]: 2026-01-21 18:14:04.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:14:05 compute-0 nova_compute[183278]: 2026-01-21 18:14:05.365 183284 DEBUG nova.network.neutron [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9] Updating instance_info_cache with network_info: [{"id": "9cf7b098-bd13-4dfa-8995-578571b027a3", "address": "fa:16:3e:bf:14:07", "network": {"id": "e8161199-7513-4099-89c4-00e7e075c92b", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-514974975-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2a4b7cdf556d4f8393d1c61b57628813", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9cf7b098-bd", "ovs_interfaceid": "9cf7b098-bd13-4dfa-8995-578571b027a3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:14:05 compute-0 nova_compute[183278]: 2026-01-21 18:14:05.389 183284 DEBUG oslo_concurrency.lockutils [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Releasing lock "refresh_cache-b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 18:14:05 compute-0 nova_compute[183278]: 2026-01-21 18:14:05.389 183284 DEBUG nova.compute.manager [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9] Instance network_info: |[{"id": "9cf7b098-bd13-4dfa-8995-578571b027a3", "address": "fa:16:3e:bf:14:07", "network": {"id": "e8161199-7513-4099-89c4-00e7e075c92b", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-514974975-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2a4b7cdf556d4f8393d1c61b57628813", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9cf7b098-bd", "ovs_interfaceid": "9cf7b098-bd13-4dfa-8995-578571b027a3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 21 18:14:05 compute-0 nova_compute[183278]: 2026-01-21 18:14:05.389 183284 DEBUG oslo_concurrency.lockutils [req-ea67685f-3548-41cb-ab30-14dec8c659c0 req-14999162-3f86-43ed-8843-e21dda13ad61 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquired lock "refresh_cache-b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 18:14:05 compute-0 nova_compute[183278]: 2026-01-21 18:14:05.390 183284 DEBUG nova.network.neutron [req-ea67685f-3548-41cb-ab30-14dec8c659c0 req-14999162-3f86-43ed-8843-e21dda13ad61 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9] Refreshing network info cache for port 9cf7b098-bd13-4dfa-8995-578571b027a3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 18:14:05 compute-0 nova_compute[183278]: 2026-01-21 18:14:05.393 183284 DEBUG nova.virt.libvirt.driver [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9] Start _get_guest_xml network_info=[{"id": "9cf7b098-bd13-4dfa-8995-578571b027a3", "address": "fa:16:3e:bf:14:07", "network": {"id": "e8161199-7513-4099-89c4-00e7e075c92b", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-514974975-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2a4b7cdf556d4f8393d1c61b57628813", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9cf7b098-bd", "ovs_interfaceid": "9cf7b098-bd13-4dfa-8995-578571b027a3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T18:09:31Z,direct_url=<?>,disk_format='qcow2',id=672306ae-5521-4fc1-a825-a16d6d125c61,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='05bd8d3790e24414a90ecf55ee1939e1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T18:09:33Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_type': 'disk', 'encryption_options': None, 'boot_index': 0, 'encryption_secret_uuid': None, 'encryption_format': None, 'image_id': '672306ae-5521-4fc1-a825-a16d6d125c61'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 18:14:05 compute-0 nova_compute[183278]: 2026-01-21 18:14:05.396 183284 WARNING nova.virt.libvirt.driver [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 18:14:05 compute-0 nova_compute[183278]: 2026-01-21 18:14:05.400 183284 DEBUG nova.virt.libvirt.host [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 18:14:05 compute-0 nova_compute[183278]: 2026-01-21 18:14:05.401 183284 DEBUG nova.virt.libvirt.host [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 18:14:05 compute-0 nova_compute[183278]: 2026-01-21 18:14:05.406 183284 DEBUG nova.virt.libvirt.host [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 18:14:05 compute-0 nova_compute[183278]: 2026-01-21 18:14:05.406 183284 DEBUG nova.virt.libvirt.host [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 18:14:05 compute-0 nova_compute[183278]: 2026-01-21 18:14:05.408 183284 DEBUG nova.virt.libvirt.driver [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 18:14:05 compute-0 nova_compute[183278]: 2026-01-21 18:14:05.408 183284 DEBUG nova.virt.hardware [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T18:09:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='45095fe9-3fd5-4f1f-87b2-a2a8292135a2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T18:09:31Z,direct_url=<?>,disk_format='qcow2',id=672306ae-5521-4fc1-a825-a16d6d125c61,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='05bd8d3790e24414a90ecf55ee1939e1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T18:09:33Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 18:14:05 compute-0 nova_compute[183278]: 2026-01-21 18:14:05.408 183284 DEBUG nova.virt.hardware [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 18:14:05 compute-0 nova_compute[183278]: 2026-01-21 18:14:05.409 183284 DEBUG nova.virt.hardware [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 18:14:05 compute-0 nova_compute[183278]: 2026-01-21 18:14:05.409 183284 DEBUG nova.virt.hardware [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 18:14:05 compute-0 nova_compute[183278]: 2026-01-21 18:14:05.409 183284 DEBUG nova.virt.hardware [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 18:14:05 compute-0 nova_compute[183278]: 2026-01-21 18:14:05.409 183284 DEBUG nova.virt.hardware [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 18:14:05 compute-0 nova_compute[183278]: 2026-01-21 18:14:05.409 183284 DEBUG nova.virt.hardware [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 18:14:05 compute-0 nova_compute[183278]: 2026-01-21 18:14:05.410 183284 DEBUG nova.virt.hardware [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 18:14:05 compute-0 nova_compute[183278]: 2026-01-21 18:14:05.410 183284 DEBUG nova.virt.hardware [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 18:14:05 compute-0 nova_compute[183278]: 2026-01-21 18:14:05.410 183284 DEBUG nova.virt.hardware [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 18:14:05 compute-0 nova_compute[183278]: 2026-01-21 18:14:05.410 183284 DEBUG nova.virt.hardware [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 18:14:05 compute-0 nova_compute[183278]: 2026-01-21 18:14:05.414 183284 DEBUG nova.virt.libvirt.vif [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T18:14:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1134094695',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1134094695',id=5,image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2a4b7cdf556d4f8393d1c61b57628813',ramdisk_id='',reservation_id='r-zgv0rt21',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-627352265',owner_user_name='tempest-TestExecuteActionsViaActuator-627352265-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T18:14:03Z,user_data=None,user_id='16f8ab2ae83b48f9a88753a5deddcc19',uuid=b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9cf7b098-bd13-4dfa-8995-578571b027a3", "address": "fa:16:3e:bf:14:07", "network": {"id": "e8161199-7513-4099-89c4-00e7e075c92b", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-514974975-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2a4b7cdf556d4f8393d1c61b57628813", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9cf7b098-bd", "ovs_interfaceid": "9cf7b098-bd13-4dfa-8995-578571b027a3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 18:14:05 compute-0 nova_compute[183278]: 2026-01-21 18:14:05.414 183284 DEBUG nova.network.os_vif_util [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Converting VIF {"id": "9cf7b098-bd13-4dfa-8995-578571b027a3", "address": "fa:16:3e:bf:14:07", "network": {"id": "e8161199-7513-4099-89c4-00e7e075c92b", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-514974975-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2a4b7cdf556d4f8393d1c61b57628813", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9cf7b098-bd", "ovs_interfaceid": "9cf7b098-bd13-4dfa-8995-578571b027a3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 18:14:05 compute-0 nova_compute[183278]: 2026-01-21 18:14:05.415 183284 DEBUG nova.network.os_vif_util [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bf:14:07,bridge_name='br-int',has_traffic_filtering=True,id=9cf7b098-bd13-4dfa-8995-578571b027a3,network=Network(e8161199-7513-4099-89c4-00e7e075c92b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9cf7b098-bd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 18:14:05 compute-0 nova_compute[183278]: 2026-01-21 18:14:05.416 183284 DEBUG nova.objects.instance [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Lazy-loading 'pci_devices' on Instance uuid b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:14:05 compute-0 nova_compute[183278]: 2026-01-21 18:14:05.429 183284 DEBUG nova.virt.libvirt.driver [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9] End _get_guest_xml xml=<domain type="kvm">
Jan 21 18:14:05 compute-0 nova_compute[183278]:   <uuid>b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9</uuid>
Jan 21 18:14:05 compute-0 nova_compute[183278]:   <name>instance-00000005</name>
Jan 21 18:14:05 compute-0 nova_compute[183278]:   <memory>131072</memory>
Jan 21 18:14:05 compute-0 nova_compute[183278]:   <vcpu>1</vcpu>
Jan 21 18:14:05 compute-0 nova_compute[183278]:   <metadata>
Jan 21 18:14:05 compute-0 nova_compute[183278]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 18:14:05 compute-0 nova_compute[183278]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 18:14:05 compute-0 nova_compute[183278]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-1134094695</nova:name>
Jan 21 18:14:05 compute-0 nova_compute[183278]:       <nova:creationTime>2026-01-21 18:14:05</nova:creationTime>
Jan 21 18:14:05 compute-0 nova_compute[183278]:       <nova:flavor name="m1.nano">
Jan 21 18:14:05 compute-0 nova_compute[183278]:         <nova:memory>128</nova:memory>
Jan 21 18:14:05 compute-0 nova_compute[183278]:         <nova:disk>1</nova:disk>
Jan 21 18:14:05 compute-0 nova_compute[183278]:         <nova:swap>0</nova:swap>
Jan 21 18:14:05 compute-0 nova_compute[183278]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 18:14:05 compute-0 nova_compute[183278]:         <nova:vcpus>1</nova:vcpus>
Jan 21 18:14:05 compute-0 nova_compute[183278]:       </nova:flavor>
Jan 21 18:14:05 compute-0 nova_compute[183278]:       <nova:owner>
Jan 21 18:14:05 compute-0 nova_compute[183278]:         <nova:user uuid="16f8ab2ae83b48f9a88753a5deddcc19">tempest-TestExecuteActionsViaActuator-627352265-project-member</nova:user>
Jan 21 18:14:05 compute-0 nova_compute[183278]:         <nova:project uuid="2a4b7cdf556d4f8393d1c61b57628813">tempest-TestExecuteActionsViaActuator-627352265</nova:project>
Jan 21 18:14:05 compute-0 nova_compute[183278]:       </nova:owner>
Jan 21 18:14:05 compute-0 nova_compute[183278]:       <nova:root type="image" uuid="672306ae-5521-4fc1-a825-a16d6d125c61"/>
Jan 21 18:14:05 compute-0 nova_compute[183278]:       <nova:ports>
Jan 21 18:14:05 compute-0 nova_compute[183278]:         <nova:port uuid="9cf7b098-bd13-4dfa-8995-578571b027a3">
Jan 21 18:14:05 compute-0 nova_compute[183278]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 21 18:14:05 compute-0 nova_compute[183278]:         </nova:port>
Jan 21 18:14:05 compute-0 nova_compute[183278]:       </nova:ports>
Jan 21 18:14:05 compute-0 nova_compute[183278]:     </nova:instance>
Jan 21 18:14:05 compute-0 nova_compute[183278]:   </metadata>
Jan 21 18:14:05 compute-0 nova_compute[183278]:   <sysinfo type="smbios">
Jan 21 18:14:05 compute-0 nova_compute[183278]:     <system>
Jan 21 18:14:05 compute-0 nova_compute[183278]:       <entry name="manufacturer">RDO</entry>
Jan 21 18:14:05 compute-0 nova_compute[183278]:       <entry name="product">OpenStack Compute</entry>
Jan 21 18:14:05 compute-0 nova_compute[183278]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 18:14:05 compute-0 nova_compute[183278]:       <entry name="serial">b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9</entry>
Jan 21 18:14:05 compute-0 nova_compute[183278]:       <entry name="uuid">b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9</entry>
Jan 21 18:14:05 compute-0 nova_compute[183278]:       <entry name="family">Virtual Machine</entry>
Jan 21 18:14:05 compute-0 nova_compute[183278]:     </system>
Jan 21 18:14:05 compute-0 nova_compute[183278]:   </sysinfo>
Jan 21 18:14:05 compute-0 nova_compute[183278]:   <os>
Jan 21 18:14:05 compute-0 nova_compute[183278]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 18:14:05 compute-0 nova_compute[183278]:     <boot dev="hd"/>
Jan 21 18:14:05 compute-0 nova_compute[183278]:     <smbios mode="sysinfo"/>
Jan 21 18:14:05 compute-0 nova_compute[183278]:   </os>
Jan 21 18:14:05 compute-0 nova_compute[183278]:   <features>
Jan 21 18:14:05 compute-0 nova_compute[183278]:     <acpi/>
Jan 21 18:14:05 compute-0 nova_compute[183278]:     <apic/>
Jan 21 18:14:05 compute-0 nova_compute[183278]:     <vmcoreinfo/>
Jan 21 18:14:05 compute-0 nova_compute[183278]:   </features>
Jan 21 18:14:05 compute-0 nova_compute[183278]:   <clock offset="utc">
Jan 21 18:14:05 compute-0 nova_compute[183278]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 18:14:05 compute-0 nova_compute[183278]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 18:14:05 compute-0 nova_compute[183278]:     <timer name="hpet" present="no"/>
Jan 21 18:14:05 compute-0 nova_compute[183278]:   </clock>
Jan 21 18:14:05 compute-0 nova_compute[183278]:   <cpu mode="custom" match="exact">
Jan 21 18:14:05 compute-0 nova_compute[183278]:     <model>Nehalem</model>
Jan 21 18:14:05 compute-0 nova_compute[183278]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 18:14:05 compute-0 nova_compute[183278]:   </cpu>
Jan 21 18:14:05 compute-0 nova_compute[183278]:   <devices>
Jan 21 18:14:05 compute-0 nova_compute[183278]:     <disk type="file" device="disk">
Jan 21 18:14:05 compute-0 nova_compute[183278]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 18:14:05 compute-0 nova_compute[183278]:       <source file="/var/lib/nova/instances/b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9/disk"/>
Jan 21 18:14:05 compute-0 nova_compute[183278]:       <target dev="vda" bus="virtio"/>
Jan 21 18:14:05 compute-0 nova_compute[183278]:     </disk>
Jan 21 18:14:05 compute-0 nova_compute[183278]:     <disk type="file" device="cdrom">
Jan 21 18:14:05 compute-0 nova_compute[183278]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 18:14:05 compute-0 nova_compute[183278]:       <source file="/var/lib/nova/instances/b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9/disk.config"/>
Jan 21 18:14:05 compute-0 nova_compute[183278]:       <target dev="sda" bus="sata"/>
Jan 21 18:14:05 compute-0 nova_compute[183278]:     </disk>
Jan 21 18:14:05 compute-0 nova_compute[183278]:     <interface type="ethernet">
Jan 21 18:14:05 compute-0 nova_compute[183278]:       <mac address="fa:16:3e:bf:14:07"/>
Jan 21 18:14:05 compute-0 nova_compute[183278]:       <model type="virtio"/>
Jan 21 18:14:05 compute-0 nova_compute[183278]:       <driver name="vhost" rx_queue_size="512"/>
Jan 21 18:14:05 compute-0 nova_compute[183278]:       <mtu size="1442"/>
Jan 21 18:14:05 compute-0 nova_compute[183278]:       <target dev="tap9cf7b098-bd"/>
Jan 21 18:14:05 compute-0 nova_compute[183278]:     </interface>
Jan 21 18:14:05 compute-0 nova_compute[183278]:     <serial type="pty">
Jan 21 18:14:05 compute-0 nova_compute[183278]:       <log file="/var/lib/nova/instances/b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9/console.log" append="off"/>
Jan 21 18:14:05 compute-0 nova_compute[183278]:     </serial>
Jan 21 18:14:05 compute-0 nova_compute[183278]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 18:14:05 compute-0 nova_compute[183278]:     <video>
Jan 21 18:14:05 compute-0 nova_compute[183278]:       <model type="virtio"/>
Jan 21 18:14:05 compute-0 nova_compute[183278]:     </video>
Jan 21 18:14:05 compute-0 nova_compute[183278]:     <input type="tablet" bus="usb"/>
Jan 21 18:14:05 compute-0 nova_compute[183278]:     <rng model="virtio">
Jan 21 18:14:05 compute-0 nova_compute[183278]:       <backend model="random">/dev/urandom</backend>
Jan 21 18:14:05 compute-0 nova_compute[183278]:     </rng>
Jan 21 18:14:05 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root"/>
Jan 21 18:14:05 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:14:05 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:14:05 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:14:05 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:14:05 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:14:05 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:14:05 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:14:05 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:14:05 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:14:05 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:14:05 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:14:05 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:14:05 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:14:05 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:14:05 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:14:05 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:14:05 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:14:05 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:14:05 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:14:05 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:14:05 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:14:05 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:14:05 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:14:05 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:14:05 compute-0 nova_compute[183278]:     <controller type="usb" index="0"/>
Jan 21 18:14:05 compute-0 nova_compute[183278]:     <memballoon model="virtio">
Jan 21 18:14:05 compute-0 nova_compute[183278]:       <stats period="10"/>
Jan 21 18:14:05 compute-0 nova_compute[183278]:     </memballoon>
Jan 21 18:14:05 compute-0 nova_compute[183278]:   </devices>
Jan 21 18:14:05 compute-0 nova_compute[183278]: </domain>
Jan 21 18:14:05 compute-0 nova_compute[183278]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 18:14:05 compute-0 nova_compute[183278]: 2026-01-21 18:14:05.431 183284 DEBUG nova.compute.manager [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9] Preparing to wait for external event network-vif-plugged-9cf7b098-bd13-4dfa-8995-578571b027a3 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 21 18:14:05 compute-0 nova_compute[183278]: 2026-01-21 18:14:05.431 183284 DEBUG oslo_concurrency.lockutils [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Acquiring lock "b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:14:05 compute-0 nova_compute[183278]: 2026-01-21 18:14:05.431 183284 DEBUG oslo_concurrency.lockutils [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Lock "b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:14:05 compute-0 nova_compute[183278]: 2026-01-21 18:14:05.431 183284 DEBUG oslo_concurrency.lockutils [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Lock "b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:14:05 compute-0 nova_compute[183278]: 2026-01-21 18:14:05.432 183284 DEBUG nova.virt.libvirt.vif [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T18:14:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1134094695',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1134094695',id=5,image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2a4b7cdf556d4f8393d1c61b57628813',ramdisk_id='',reservation_id='r-zgv0rt21',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-627352265',owner_user_name='tempest-TestExecuteActionsViaActuator-627352265-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T18:14:03Z,user_data=None,user_id='16f8ab2ae83b48f9a88753a5deddcc19',uuid=b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9cf7b098-bd13-4dfa-8995-578571b027a3", "address": "fa:16:3e:bf:14:07", "network": {"id": "e8161199-7513-4099-89c4-00e7e075c92b", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-514974975-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2a4b7cdf556d4f8393d1c61b57628813", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9cf7b098-bd", "ovs_interfaceid": "9cf7b098-bd13-4dfa-8995-578571b027a3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 21 18:14:05 compute-0 nova_compute[183278]: 2026-01-21 18:14:05.432 183284 DEBUG nova.network.os_vif_util [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Converting VIF {"id": "9cf7b098-bd13-4dfa-8995-578571b027a3", "address": "fa:16:3e:bf:14:07", "network": {"id": "e8161199-7513-4099-89c4-00e7e075c92b", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-514974975-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2a4b7cdf556d4f8393d1c61b57628813", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9cf7b098-bd", "ovs_interfaceid": "9cf7b098-bd13-4dfa-8995-578571b027a3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 18:14:05 compute-0 nova_compute[183278]: 2026-01-21 18:14:05.433 183284 DEBUG nova.network.os_vif_util [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bf:14:07,bridge_name='br-int',has_traffic_filtering=True,id=9cf7b098-bd13-4dfa-8995-578571b027a3,network=Network(e8161199-7513-4099-89c4-00e7e075c92b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9cf7b098-bd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 18:14:05 compute-0 nova_compute[183278]: 2026-01-21 18:14:05.433 183284 DEBUG os_vif [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:bf:14:07,bridge_name='br-int',has_traffic_filtering=True,id=9cf7b098-bd13-4dfa-8995-578571b027a3,network=Network(e8161199-7513-4099-89c4-00e7e075c92b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9cf7b098-bd') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 21 18:14:05 compute-0 nova_compute[183278]: 2026-01-21 18:14:05.434 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:14:05 compute-0 nova_compute[183278]: 2026-01-21 18:14:05.434 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:14:05 compute-0 nova_compute[183278]: 2026-01-21 18:14:05.434 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 18:14:05 compute-0 nova_compute[183278]: 2026-01-21 18:14:05.437 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:14:05 compute-0 nova_compute[183278]: 2026-01-21 18:14:05.437 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9cf7b098-bd, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:14:05 compute-0 nova_compute[183278]: 2026-01-21 18:14:05.438 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9cf7b098-bd, col_values=(('external_ids', {'iface-id': '9cf7b098-bd13-4dfa-8995-578571b027a3', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:bf:14:07', 'vm-uuid': 'b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:14:05 compute-0 nova_compute[183278]: 2026-01-21 18:14:05.439 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:14:05 compute-0 NetworkManager[55506]: <info>  [1769019245.4400] manager: (tap9cf7b098-bd): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/28)
Jan 21 18:14:05 compute-0 nova_compute[183278]: 2026-01-21 18:14:05.441 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 18:14:05 compute-0 nova_compute[183278]: 2026-01-21 18:14:05.446 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:14:05 compute-0 nova_compute[183278]: 2026-01-21 18:14:05.447 183284 INFO os_vif [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:bf:14:07,bridge_name='br-int',has_traffic_filtering=True,id=9cf7b098-bd13-4dfa-8995-578571b027a3,network=Network(e8161199-7513-4099-89c4-00e7e075c92b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9cf7b098-bd')
Jan 21 18:14:05 compute-0 nova_compute[183278]: 2026-01-21 18:14:05.532 183284 DEBUG nova.virt.libvirt.driver [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 18:14:05 compute-0 nova_compute[183278]: 2026-01-21 18:14:05.532 183284 DEBUG nova.virt.libvirt.driver [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 18:14:05 compute-0 nova_compute[183278]: 2026-01-21 18:14:05.532 183284 DEBUG nova.virt.libvirt.driver [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] No VIF found with MAC fa:16:3e:bf:14:07, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 21 18:14:05 compute-0 nova_compute[183278]: 2026-01-21 18:14:05.533 183284 INFO nova.virt.libvirt.driver [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9] Using config drive
Jan 21 18:14:05 compute-0 nova_compute[183278]: 2026-01-21 18:14:05.818 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:14:05 compute-0 nova_compute[183278]: 2026-01-21 18:14:05.818 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 18:14:05 compute-0 nova_compute[183278]: 2026-01-21 18:14:05.818 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 18:14:05 compute-0 nova_compute[183278]: 2026-01-21 18:14:05.849 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] [instance: b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Jan 21 18:14:06 compute-0 nova_compute[183278]: 2026-01-21 18:14:06.441 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "refresh_cache-b499883a-ee9f-4239-b996-4fbaa175bcc3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:14:06 compute-0 nova_compute[183278]: 2026-01-21 18:14:06.441 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquired lock "refresh_cache-b499883a-ee9f-4239-b996-4fbaa175bcc3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 18:14:06 compute-0 nova_compute[183278]: 2026-01-21 18:14:06.441 183284 DEBUG nova.network.neutron [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 21 18:14:06 compute-0 nova_compute[183278]: 2026-01-21 18:14:06.441 183284 DEBUG nova.objects.instance [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lazy-loading 'info_cache' on Instance uuid b499883a-ee9f-4239-b996-4fbaa175bcc3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:14:07 compute-0 nova_compute[183278]: 2026-01-21 18:14:07.424 183284 INFO nova.virt.libvirt.driver [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9] Creating config drive at /var/lib/nova/instances/b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9/disk.config
Jan 21 18:14:07 compute-0 nova_compute[183278]: 2026-01-21 18:14:07.430 183284 DEBUG oslo_concurrency.processutils [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1_0ub0gf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:14:07 compute-0 nova_compute[183278]: 2026-01-21 18:14:07.552 183284 DEBUG oslo_concurrency.processutils [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1_0ub0gf" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:14:07 compute-0 kernel: tap9cf7b098-bd: entered promiscuous mode
Jan 21 18:14:07 compute-0 NetworkManager[55506]: <info>  [1769019247.6037] manager: (tap9cf7b098-bd): new Tun device (/org/freedesktop/NetworkManager/Devices/29)
Jan 21 18:14:07 compute-0 ovn_controller[95419]: 2026-01-21T18:14:07Z|00040|binding|INFO|Claiming lport 9cf7b098-bd13-4dfa-8995-578571b027a3 for this chassis.
Jan 21 18:14:07 compute-0 nova_compute[183278]: 2026-01-21 18:14:07.605 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:14:07 compute-0 ovn_controller[95419]: 2026-01-21T18:14:07Z|00041|binding|INFO|9cf7b098-bd13-4dfa-8995-578571b027a3: Claiming fa:16:3e:bf:14:07 10.100.0.6
Jan 21 18:14:07 compute-0 ovn_controller[95419]: 2026-01-21T18:14:07Z|00042|binding|INFO|Setting lport 9cf7b098-bd13-4dfa-8995-578571b027a3 ovn-installed in OVS
Jan 21 18:14:07 compute-0 nova_compute[183278]: 2026-01-21 18:14:07.619 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:14:07 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:14:07.622 104698 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bf:14:07 10.100.0.6'], port_security=['fa:16:3e:bf:14:07 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e8161199-7513-4099-89c4-00e7e075c92b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a4b7cdf556d4f8393d1c61b57628813', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c5ea7560-106a-40fd-a00a-355d8be6545e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=afee9644-a390-49fb-b346-3fd1c948feef, chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>], tunnel_key=7, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>], logical_port=9cf7b098-bd13-4dfa-8995-578571b027a3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 18:14:07 compute-0 ovn_controller[95419]: 2026-01-21T18:14:07Z|00043|binding|INFO|Setting lport 9cf7b098-bd13-4dfa-8995-578571b027a3 up in Southbound
Jan 21 18:14:07 compute-0 nova_compute[183278]: 2026-01-21 18:14:07.622 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:14:07 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:14:07.623 104698 INFO neutron.agent.ovn.metadata.agent [-] Port 9cf7b098-bd13-4dfa-8995-578571b027a3 in datapath e8161199-7513-4099-89c4-00e7e075c92b bound to our chassis
Jan 21 18:14:07 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:14:07.624 104698 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e8161199-7513-4099-89c4-00e7e075c92b
Jan 21 18:14:07 compute-0 systemd-udevd[204542]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 18:14:07 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:14:07.639 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[381bbe72-4889-4589-9100-bbaa4a28aa6c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:14:07 compute-0 systemd-machined[154592]: New machine qemu-3-instance-00000005.
Jan 21 18:14:07 compute-0 NetworkManager[55506]: <info>  [1769019247.6471] device (tap9cf7b098-bd): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 18:14:07 compute-0 NetworkManager[55506]: <info>  [1769019247.6476] device (tap9cf7b098-bd): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 18:14:07 compute-0 systemd[1]: Started Virtual Machine qemu-3-instance-00000005.
Jan 21 18:14:07 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:14:07.670 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[65e7adc4-f3c9-4fce-a904-eb681939eb3a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:14:07 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:14:07.674 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[6faa3a6e-9a52-4422-a8f1-7bcf97099c48]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:14:07 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:14:07.701 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[3155f34f-4e5e-461d-87fa-4eca18e14244]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:14:07 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:14:07.718 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[1066b528-b227-42cf-92f1-32a5b5791ac3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape8161199-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bf:ce:84'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 378321, 'reachable_time': 16740, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 204555, 'error': None, 'target': 'ovnmeta-e8161199-7513-4099-89c4-00e7e075c92b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:14:07 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:14:07.734 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[e2f147fc-e28f-45cb-a64e-720fafc8c59e]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tape8161199-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 378331, 'tstamp': 378331}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 204557, 'error': None, 'target': 'ovnmeta-e8161199-7513-4099-89c4-00e7e075c92b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tape8161199-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 378334, 'tstamp': 378334}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 204557, 'error': None, 'target': 'ovnmeta-e8161199-7513-4099-89c4-00e7e075c92b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:14:07 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:14:07.736 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape8161199-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:14:07 compute-0 nova_compute[183278]: 2026-01-21 18:14:07.738 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:14:07 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:14:07.739 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape8161199-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:14:07 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:14:07.739 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 18:14:07 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:14:07.740 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape8161199-70, col_values=(('external_ids', {'iface-id': 'd5993779-4a27-48a2-a904-ec457f58cb35'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:14:07 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:14:07.740 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 18:14:07 compute-0 nova_compute[183278]: 2026-01-21 18:14:07.923 183284 DEBUG nova.virt.driver [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Emitting event <LifecycleEvent: 1769019247.9224236, b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:14:07 compute-0 nova_compute[183278]: 2026-01-21 18:14:07.923 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9] VM Started (Lifecycle Event)
Jan 21 18:14:07 compute-0 nova_compute[183278]: 2026-01-21 18:14:07.940 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:14:07 compute-0 nova_compute[183278]: 2026-01-21 18:14:07.944 183284 DEBUG nova.virt.driver [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Emitting event <LifecycleEvent: 1769019247.9230494, b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:14:07 compute-0 nova_compute[183278]: 2026-01-21 18:14:07.944 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9] VM Paused (Lifecycle Event)
Jan 21 18:14:07 compute-0 nova_compute[183278]: 2026-01-21 18:14:07.964 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:14:07 compute-0 nova_compute[183278]: 2026-01-21 18:14:07.967 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 18:14:08 compute-0 nova_compute[183278]: 2026-01-21 18:14:08.009 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 18:14:09 compute-0 nova_compute[183278]: 2026-01-21 18:14:09.132 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:14:09 compute-0 nova_compute[183278]: 2026-01-21 18:14:09.529 183284 DEBUG nova.compute.manager [req-b3545e9b-9484-4161-ad4c-98c1df5b5db4 req-18fd521a-c0cb-4127-bb5f-edb3733917b8 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9] Received event network-vif-plugged-9cf7b098-bd13-4dfa-8995-578571b027a3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:14:09 compute-0 nova_compute[183278]: 2026-01-21 18:14:09.529 183284 DEBUG oslo_concurrency.lockutils [req-b3545e9b-9484-4161-ad4c-98c1df5b5db4 req-18fd521a-c0cb-4127-bb5f-edb3733917b8 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:14:09 compute-0 nova_compute[183278]: 2026-01-21 18:14:09.529 183284 DEBUG oslo_concurrency.lockutils [req-b3545e9b-9484-4161-ad4c-98c1df5b5db4 req-18fd521a-c0cb-4127-bb5f-edb3733917b8 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:14:09 compute-0 nova_compute[183278]: 2026-01-21 18:14:09.530 183284 DEBUG oslo_concurrency.lockutils [req-b3545e9b-9484-4161-ad4c-98c1df5b5db4 req-18fd521a-c0cb-4127-bb5f-edb3733917b8 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:14:09 compute-0 nova_compute[183278]: 2026-01-21 18:14:09.530 183284 DEBUG nova.compute.manager [req-b3545e9b-9484-4161-ad4c-98c1df5b5db4 req-18fd521a-c0cb-4127-bb5f-edb3733917b8 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9] Processing event network-vif-plugged-9cf7b098-bd13-4dfa-8995-578571b027a3 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 21 18:14:09 compute-0 nova_compute[183278]: 2026-01-21 18:14:09.531 183284 DEBUG nova.compute.manager [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 18:14:09 compute-0 nova_compute[183278]: 2026-01-21 18:14:09.535 183284 DEBUG nova.virt.libvirt.driver [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 18:14:09 compute-0 nova_compute[183278]: 2026-01-21 18:14:09.535 183284 DEBUG nova.virt.driver [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Emitting event <LifecycleEvent: 1769019249.5348525, b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:14:09 compute-0 nova_compute[183278]: 2026-01-21 18:14:09.535 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9] VM Resumed (Lifecycle Event)
Jan 21 18:14:09 compute-0 nova_compute[183278]: 2026-01-21 18:14:09.540 183284 INFO nova.virt.libvirt.driver [-] [instance: b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9] Instance spawned successfully.
Jan 21 18:14:09 compute-0 nova_compute[183278]: 2026-01-21 18:14:09.541 183284 DEBUG nova.virt.libvirt.driver [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 21 18:14:09 compute-0 nova_compute[183278]: 2026-01-21 18:14:09.568 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:14:09 compute-0 nova_compute[183278]: 2026-01-21 18:14:09.574 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 18:14:09 compute-0 nova_compute[183278]: 2026-01-21 18:14:09.578 183284 DEBUG nova.virt.libvirt.driver [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:14:09 compute-0 nova_compute[183278]: 2026-01-21 18:14:09.578 183284 DEBUG nova.virt.libvirt.driver [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:14:09 compute-0 nova_compute[183278]: 2026-01-21 18:14:09.579 183284 DEBUG nova.virt.libvirt.driver [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:14:09 compute-0 nova_compute[183278]: 2026-01-21 18:14:09.579 183284 DEBUG nova.virt.libvirt.driver [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:14:09 compute-0 nova_compute[183278]: 2026-01-21 18:14:09.580 183284 DEBUG nova.virt.libvirt.driver [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:14:09 compute-0 nova_compute[183278]: 2026-01-21 18:14:09.580 183284 DEBUG nova.virt.libvirt.driver [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:14:09 compute-0 nova_compute[183278]: 2026-01-21 18:14:09.622 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 18:14:09 compute-0 nova_compute[183278]: 2026-01-21 18:14:09.664 183284 INFO nova.compute.manager [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9] Took 6.34 seconds to spawn the instance on the hypervisor.
Jan 21 18:14:09 compute-0 nova_compute[183278]: 2026-01-21 18:14:09.664 183284 DEBUG nova.compute.manager [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:14:09 compute-0 nova_compute[183278]: 2026-01-21 18:14:09.784 183284 INFO nova.compute.manager [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9] Took 6.95 seconds to build instance.
Jan 21 18:14:09 compute-0 nova_compute[183278]: 2026-01-21 18:14:09.811 183284 DEBUG oslo_concurrency.lockutils [None req-b0aadabe-1d72-48e8-91f9-cc41c9c287c9 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Lock "b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.073s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:14:10 compute-0 podman[204566]: 2026-01-21 18:14:10.000907387 +0000 UTC m=+0.052175416 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Jan 21 18:14:10 compute-0 podman[204565]: 2026-01-21 18:14:10.060484683 +0000 UTC m=+0.111316412 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, 
org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 21 18:14:10 compute-0 nova_compute[183278]: 2026-01-21 18:14:10.440 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:14:10 compute-0 nova_compute[183278]: 2026-01-21 18:14:10.442 183284 DEBUG nova.network.neutron [req-ea67685f-3548-41cb-ab30-14dec8c659c0 req-14999162-3f86-43ed-8843-e21dda13ad61 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9] Updated VIF entry in instance network info cache for port 9cf7b098-bd13-4dfa-8995-578571b027a3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 18:14:10 compute-0 nova_compute[183278]: 2026-01-21 18:14:10.442 183284 DEBUG nova.network.neutron [req-ea67685f-3548-41cb-ab30-14dec8c659c0 req-14999162-3f86-43ed-8843-e21dda13ad61 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9] Updating instance_info_cache with network_info: [{"id": "9cf7b098-bd13-4dfa-8995-578571b027a3", "address": "fa:16:3e:bf:14:07", "network": {"id": "e8161199-7513-4099-89c4-00e7e075c92b", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-514974975-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2a4b7cdf556d4f8393d1c61b57628813", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9cf7b098-bd", "ovs_interfaceid": "9cf7b098-bd13-4dfa-8995-578571b027a3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:14:10 compute-0 nova_compute[183278]: 2026-01-21 18:14:10.592 183284 DEBUG oslo_concurrency.lockutils [req-ea67685f-3548-41cb-ab30-14dec8c659c0 req-14999162-3f86-43ed-8843-e21dda13ad61 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Releasing lock "refresh_cache-b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 18:14:10 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:14:10.760 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a4db021d-a451-4e5f-8011-49af760bda68, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:14:11 compute-0 nova_compute[183278]: 2026-01-21 18:14:11.496 183284 DEBUG nova.network.neutron [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Updating instance_info_cache with network_info: [{"id": "46d8fa01-a7bd-4849-904d-01dc9c7071a4", "address": "fa:16:3e:39:dd:15", "network": {"id": "e8161199-7513-4099-89c4-00e7e075c92b", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-514974975-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2a4b7cdf556d4f8393d1c61b57628813", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap46d8fa01-a7", "ovs_interfaceid": "46d8fa01-a7bd-4849-904d-01dc9c7071a4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:14:11 compute-0 nova_compute[183278]: 2026-01-21 18:14:11.520 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Releasing lock "refresh_cache-b499883a-ee9f-4239-b996-4fbaa175bcc3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 18:14:11 compute-0 nova_compute[183278]: 2026-01-21 18:14:11.520 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 21 18:14:11 compute-0 nova_compute[183278]: 2026-01-21 18:14:11.520 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:14:11 compute-0 nova_compute[183278]: 2026-01-21 18:14:11.521 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:14:11 compute-0 nova_compute[183278]: 2026-01-21 18:14:11.521 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:14:11 compute-0 nova_compute[183278]: 2026-01-21 18:14:11.521 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:14:11 compute-0 nova_compute[183278]: 2026-01-21 18:14:11.522 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 18:14:11 compute-0 nova_compute[183278]: 2026-01-21 18:14:11.522 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:14:11 compute-0 nova_compute[183278]: 2026-01-21 18:14:11.550 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:14:11 compute-0 nova_compute[183278]: 2026-01-21 18:14:11.550 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:14:11 compute-0 nova_compute[183278]: 2026-01-21 18:14:11.551 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:14:11 compute-0 nova_compute[183278]: 2026-01-21 18:14:11.551 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 18:14:11 compute-0 nova_compute[183278]: 2026-01-21 18:14:11.613 183284 DEBUG nova.compute.manager [req-0c7a66ff-638d-47c7-8826-ce86481e0895 req-20232ccb-2f14-4701-b8d1-2bf23a4d086b 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9] Received event network-vif-plugged-9cf7b098-bd13-4dfa-8995-578571b027a3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:14:11 compute-0 nova_compute[183278]: 2026-01-21 18:14:11.614 183284 DEBUG oslo_concurrency.lockutils [req-0c7a66ff-638d-47c7-8826-ce86481e0895 req-20232ccb-2f14-4701-b8d1-2bf23a4d086b 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:14:11 compute-0 nova_compute[183278]: 2026-01-21 18:14:11.614 183284 DEBUG oslo_concurrency.lockutils [req-0c7a66ff-638d-47c7-8826-ce86481e0895 req-20232ccb-2f14-4701-b8d1-2bf23a4d086b 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:14:11 compute-0 nova_compute[183278]: 2026-01-21 18:14:11.614 183284 DEBUG oslo_concurrency.lockutils [req-0c7a66ff-638d-47c7-8826-ce86481e0895 req-20232ccb-2f14-4701-b8d1-2bf23a4d086b 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:14:11 compute-0 nova_compute[183278]: 2026-01-21 18:14:11.615 183284 DEBUG nova.compute.manager [req-0c7a66ff-638d-47c7-8826-ce86481e0895 req-20232ccb-2f14-4701-b8d1-2bf23a4d086b 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9] No waiting events found dispatching network-vif-plugged-9cf7b098-bd13-4dfa-8995-578571b027a3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:14:11 compute-0 nova_compute[183278]: 2026-01-21 18:14:11.615 183284 WARNING nova.compute.manager [req-0c7a66ff-638d-47c7-8826-ce86481e0895 req-20232ccb-2f14-4701-b8d1-2bf23a4d086b 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9] Received unexpected event network-vif-plugged-9cf7b098-bd13-4dfa-8995-578571b027a3 for instance with vm_state active and task_state None.
Jan 21 18:14:11 compute-0 nova_compute[183278]: 2026-01-21 18:14:11.620 183284 DEBUG oslo_concurrency.processutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b499883a-ee9f-4239-b996-4fbaa175bcc3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:14:12 compute-0 nova_compute[183278]: 2026-01-21 18:14:12.100 183284 DEBUG oslo_concurrency.processutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b499883a-ee9f-4239-b996-4fbaa175bcc3/disk --force-share --output=json" returned: 0 in 0.480s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:14:12 compute-0 nova_compute[183278]: 2026-01-21 18:14:12.101 183284 DEBUG oslo_concurrency.processutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b499883a-ee9f-4239-b996-4fbaa175bcc3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:14:12 compute-0 nova_compute[183278]: 2026-01-21 18:14:12.169 183284 DEBUG oslo_concurrency.processutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b499883a-ee9f-4239-b996-4fbaa175bcc3/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:14:12 compute-0 nova_compute[183278]: 2026-01-21 18:14:12.174 183284 DEBUG oslo_concurrency.processutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:14:12 compute-0 nova_compute[183278]: 2026-01-21 18:14:12.234 183284 DEBUG oslo_concurrency.processutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:14:12 compute-0 nova_compute[183278]: 2026-01-21 18:14:12.235 183284 DEBUG oslo_concurrency.processutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:14:12 compute-0 nova_compute[183278]: 2026-01-21 18:14:12.302 183284 DEBUG oslo_concurrency.processutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:14:12 compute-0 nova_compute[183278]: 2026-01-21 18:14:12.439 183284 WARNING nova.virt.libvirt.driver [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 18:14:12 compute-0 nova_compute[183278]: 2026-01-21 18:14:12.440 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5586MB free_disk=73.35425567626953GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 18:14:12 compute-0 nova_compute[183278]: 2026-01-21 18:14:12.441 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:14:12 compute-0 nova_compute[183278]: 2026-01-21 18:14:12.441 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:14:12 compute-0 nova_compute[183278]: 2026-01-21 18:14:12.739 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Instance b499883a-ee9f-4239-b996-4fbaa175bcc3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 21 18:14:12 compute-0 nova_compute[183278]: 2026-01-21 18:14:12.740 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Instance b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 21 18:14:12 compute-0 nova_compute[183278]: 2026-01-21 18:14:12.740 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 18:14:12 compute-0 nova_compute[183278]: 2026-01-21 18:14:12.740 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 18:14:12 compute-0 nova_compute[183278]: 2026-01-21 18:14:12.796 183284 DEBUG nova.compute.provider_tree [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Inventory has not changed in ProviderTree for provider: 502e4243-611b-433d-a766-9b485d51652d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 18:14:12 compute-0 nova_compute[183278]: 2026-01-21 18:14:12.811 183284 DEBUG nova.scheduler.client.report [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Inventory has not changed for provider 502e4243-611b-433d-a766-9b485d51652d based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 18:14:12 compute-0 nova_compute[183278]: 2026-01-21 18:14:12.831 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 18:14:12 compute-0 nova_compute[183278]: 2026-01-21 18:14:12.832 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.391s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:14:13 compute-0 podman[204623]: 2026-01-21 18:14:13.999131842 +0000 UTC m=+0.052750870 container health_status 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 21 18:14:14 compute-0 nova_compute[183278]: 2026-01-21 18:14:14.135 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:14:15 compute-0 nova_compute[183278]: 2026-01-21 18:14:15.442 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:14:16 compute-0 nova_compute[183278]: 2026-01-21 18:14:16.826 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:14:17 compute-0 nova_compute[183278]: 2026-01-21 18:14:17.166 183284 DEBUG oslo_concurrency.lockutils [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Acquiring lock "c4cf9774-0343-498d-9bca-196666c53830" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:14:17 compute-0 nova_compute[183278]: 2026-01-21 18:14:17.166 183284 DEBUG oslo_concurrency.lockutils [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Lock "c4cf9774-0343-498d-9bca-196666c53830" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:14:17 compute-0 nova_compute[183278]: 2026-01-21 18:14:17.180 183284 DEBUG nova.compute.manager [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: c4cf9774-0343-498d-9bca-196666c53830] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 21 18:14:17 compute-0 nova_compute[183278]: 2026-01-21 18:14:17.238 183284 DEBUG oslo_concurrency.lockutils [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:14:17 compute-0 nova_compute[183278]: 2026-01-21 18:14:17.238 183284 DEBUG oslo_concurrency.lockutils [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:14:17 compute-0 nova_compute[183278]: 2026-01-21 18:14:17.244 183284 DEBUG nova.virt.hardware [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 18:14:17 compute-0 nova_compute[183278]: 2026-01-21 18:14:17.244 183284 INFO nova.compute.claims [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: c4cf9774-0343-498d-9bca-196666c53830] Claim successful on node compute-0.ctlplane.example.com
Jan 21 18:14:17 compute-0 nova_compute[183278]: 2026-01-21 18:14:17.406 183284 DEBUG nova.compute.provider_tree [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Inventory has not changed in ProviderTree for provider: 502e4243-611b-433d-a766-9b485d51652d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 18:14:17 compute-0 nova_compute[183278]: 2026-01-21 18:14:17.422 183284 DEBUG nova.scheduler.client.report [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Inventory has not changed for provider 502e4243-611b-433d-a766-9b485d51652d based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 18:14:17 compute-0 nova_compute[183278]: 2026-01-21 18:14:17.440 183284 DEBUG oslo_concurrency.lockutils [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.202s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:14:17 compute-0 nova_compute[183278]: 2026-01-21 18:14:17.440 183284 DEBUG nova.compute.manager [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: c4cf9774-0343-498d-9bca-196666c53830] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 21 18:14:17 compute-0 nova_compute[183278]: 2026-01-21 18:14:17.488 183284 DEBUG nova.compute.manager [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: c4cf9774-0343-498d-9bca-196666c53830] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 21 18:14:17 compute-0 nova_compute[183278]: 2026-01-21 18:14:17.488 183284 DEBUG nova.network.neutron [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: c4cf9774-0343-498d-9bca-196666c53830] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 21 18:14:17 compute-0 nova_compute[183278]: 2026-01-21 18:14:17.507 183284 INFO nova.virt.libvirt.driver [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: c4cf9774-0343-498d-9bca-196666c53830] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 21 18:14:17 compute-0 nova_compute[183278]: 2026-01-21 18:14:17.525 183284 DEBUG nova.compute.manager [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: c4cf9774-0343-498d-9bca-196666c53830] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 21 18:14:17 compute-0 nova_compute[183278]: 2026-01-21 18:14:17.607 183284 DEBUG nova.compute.manager [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: c4cf9774-0343-498d-9bca-196666c53830] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 21 18:14:17 compute-0 nova_compute[183278]: 2026-01-21 18:14:17.608 183284 DEBUG nova.virt.libvirt.driver [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: c4cf9774-0343-498d-9bca-196666c53830] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 18:14:17 compute-0 nova_compute[183278]: 2026-01-21 18:14:17.609 183284 INFO nova.virt.libvirt.driver [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: c4cf9774-0343-498d-9bca-196666c53830] Creating image(s)
Jan 21 18:14:17 compute-0 nova_compute[183278]: 2026-01-21 18:14:17.609 183284 DEBUG oslo_concurrency.lockutils [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Acquiring lock "/var/lib/nova/instances/c4cf9774-0343-498d-9bca-196666c53830/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:14:17 compute-0 nova_compute[183278]: 2026-01-21 18:14:17.609 183284 DEBUG oslo_concurrency.lockutils [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Lock "/var/lib/nova/instances/c4cf9774-0343-498d-9bca-196666c53830/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:14:17 compute-0 nova_compute[183278]: 2026-01-21 18:14:17.610 183284 DEBUG oslo_concurrency.lockutils [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Lock "/var/lib/nova/instances/c4cf9774-0343-498d-9bca-196666c53830/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:14:17 compute-0 nova_compute[183278]: 2026-01-21 18:14:17.621 183284 DEBUG oslo_concurrency.processutils [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:14:17 compute-0 nova_compute[183278]: 2026-01-21 18:14:17.677 183284 DEBUG oslo_concurrency.processutils [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:14:17 compute-0 nova_compute[183278]: 2026-01-21 18:14:17.678 183284 DEBUG oslo_concurrency.lockutils [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Acquiring lock "8bf45782a806e8f2f684ae874be9ab99d891a685" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:14:17 compute-0 nova_compute[183278]: 2026-01-21 18:14:17.679 183284 DEBUG oslo_concurrency.lockutils [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Lock "8bf45782a806e8f2f684ae874be9ab99d891a685" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:14:17 compute-0 nova_compute[183278]: 2026-01-21 18:14:17.692 183284 DEBUG oslo_concurrency.processutils [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:14:17 compute-0 nova_compute[183278]: 2026-01-21 18:14:17.747 183284 DEBUG oslo_concurrency.processutils [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:14:17 compute-0 nova_compute[183278]: 2026-01-21 18:14:17.748 183284 DEBUG oslo_concurrency.processutils [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685,backing_fmt=raw /var/lib/nova/instances/c4cf9774-0343-498d-9bca-196666c53830/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:14:18 compute-0 nova_compute[183278]: 2026-01-21 18:14:18.486 183284 DEBUG nova.policy [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '16f8ab2ae83b48f9a88753a5deddcc19', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2a4b7cdf556d4f8393d1c61b57628813', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 21 18:14:18 compute-0 nova_compute[183278]: 2026-01-21 18:14:18.689 183284 DEBUG oslo_concurrency.processutils [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685,backing_fmt=raw /var/lib/nova/instances/c4cf9774-0343-498d-9bca-196666c53830/disk 1073741824" returned: 0 in 0.940s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:14:18 compute-0 nova_compute[183278]: 2026-01-21 18:14:18.689 183284 DEBUG oslo_concurrency.lockutils [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Lock "8bf45782a806e8f2f684ae874be9ab99d891a685" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 1.011s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:14:18 compute-0 nova_compute[183278]: 2026-01-21 18:14:18.690 183284 DEBUG oslo_concurrency.processutils [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:14:18 compute-0 nova_compute[183278]: 2026-01-21 18:14:18.745 183284 DEBUG oslo_concurrency.processutils [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:14:18 compute-0 nova_compute[183278]: 2026-01-21 18:14:18.746 183284 DEBUG nova.virt.disk.api [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Checking if we can resize image /var/lib/nova/instances/c4cf9774-0343-498d-9bca-196666c53830/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 18:14:18 compute-0 nova_compute[183278]: 2026-01-21 18:14:18.746 183284 DEBUG oslo_concurrency.processutils [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c4cf9774-0343-498d-9bca-196666c53830/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:14:18 compute-0 nova_compute[183278]: 2026-01-21 18:14:18.802 183284 DEBUG oslo_concurrency.processutils [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c4cf9774-0343-498d-9bca-196666c53830/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:14:18 compute-0 nova_compute[183278]: 2026-01-21 18:14:18.803 183284 DEBUG nova.virt.disk.api [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Cannot resize image /var/lib/nova/instances/c4cf9774-0343-498d-9bca-196666c53830/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 18:14:18 compute-0 nova_compute[183278]: 2026-01-21 18:14:18.804 183284 DEBUG nova.objects.instance [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Lazy-loading 'migration_context' on Instance uuid c4cf9774-0343-498d-9bca-196666c53830 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:14:18 compute-0 nova_compute[183278]: 2026-01-21 18:14:18.825 183284 DEBUG nova.virt.libvirt.driver [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: c4cf9774-0343-498d-9bca-196666c53830] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 18:14:18 compute-0 nova_compute[183278]: 2026-01-21 18:14:18.826 183284 DEBUG nova.virt.libvirt.driver [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: c4cf9774-0343-498d-9bca-196666c53830] Ensure instance console log exists: /var/lib/nova/instances/c4cf9774-0343-498d-9bca-196666c53830/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 18:14:18 compute-0 nova_compute[183278]: 2026-01-21 18:14:18.826 183284 DEBUG oslo_concurrency.lockutils [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:14:18 compute-0 nova_compute[183278]: 2026-01-21 18:14:18.826 183284 DEBUG oslo_concurrency.lockutils [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:14:18 compute-0 nova_compute[183278]: 2026-01-21 18:14:18.827 183284 DEBUG oslo_concurrency.lockutils [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:14:19 compute-0 nova_compute[183278]: 2026-01-21 18:14:19.136 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:14:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:14:20.070 104698 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:14:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:14:20.070 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:14:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:14:20.071 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:14:20 compute-0 nova_compute[183278]: 2026-01-21 18:14:20.445 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:14:21 compute-0 nova_compute[183278]: 2026-01-21 18:14:21.501 183284 DEBUG nova.network.neutron [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: c4cf9774-0343-498d-9bca-196666c53830] Successfully created port: 2471f7dc-ce20-49f1-92be-a9f8b557ae8b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 21 18:14:23 compute-0 nova_compute[183278]: 2026-01-21 18:14:23.411 183284 DEBUG nova.network.neutron [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: c4cf9774-0343-498d-9bca-196666c53830] Successfully updated port: 2471f7dc-ce20-49f1-92be-a9f8b557ae8b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 21 18:14:23 compute-0 nova_compute[183278]: 2026-01-21 18:14:23.519 183284 DEBUG oslo_concurrency.lockutils [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Acquiring lock "refresh_cache-c4cf9774-0343-498d-9bca-196666c53830" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:14:23 compute-0 nova_compute[183278]: 2026-01-21 18:14:23.519 183284 DEBUG oslo_concurrency.lockutils [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Acquired lock "refresh_cache-c4cf9774-0343-498d-9bca-196666c53830" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 18:14:23 compute-0 nova_compute[183278]: 2026-01-21 18:14:23.519 183284 DEBUG nova.network.neutron [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: c4cf9774-0343-498d-9bca-196666c53830] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 18:14:23 compute-0 nova_compute[183278]: 2026-01-21 18:14:23.665 183284 DEBUG nova.compute.manager [req-0a2d1793-1ed8-4f3e-93fe-5628fc1c2532 req-ee7e8f5f-18b9-47ee-ba95-d9edfe09b6c4 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c4cf9774-0343-498d-9bca-196666c53830] Received event network-changed-2471f7dc-ce20-49f1-92be-a9f8b557ae8b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:14:23 compute-0 nova_compute[183278]: 2026-01-21 18:14:23.665 183284 DEBUG nova.compute.manager [req-0a2d1793-1ed8-4f3e-93fe-5628fc1c2532 req-ee7e8f5f-18b9-47ee-ba95-d9edfe09b6c4 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c4cf9774-0343-498d-9bca-196666c53830] Refreshing instance network info cache due to event network-changed-2471f7dc-ce20-49f1-92be-a9f8b557ae8b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 18:14:23 compute-0 nova_compute[183278]: 2026-01-21 18:14:23.665 183284 DEBUG oslo_concurrency.lockutils [req-0a2d1793-1ed8-4f3e-93fe-5628fc1c2532 req-ee7e8f5f-18b9-47ee-ba95-d9edfe09b6c4 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "refresh_cache-c4cf9774-0343-498d-9bca-196666c53830" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:14:23 compute-0 nova_compute[183278]: 2026-01-21 18:14:23.747 183284 DEBUG nova.network.neutron [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: c4cf9774-0343-498d-9bca-196666c53830] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 18:14:24 compute-0 nova_compute[183278]: 2026-01-21 18:14:24.137 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:14:24 compute-0 nova_compute[183278]: 2026-01-21 18:14:24.928 183284 DEBUG nova.network.neutron [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: c4cf9774-0343-498d-9bca-196666c53830] Updating instance_info_cache with network_info: [{"id": "2471f7dc-ce20-49f1-92be-a9f8b557ae8b", "address": "fa:16:3e:fd:5e:35", "network": {"id": "e8161199-7513-4099-89c4-00e7e075c92b", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-514974975-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2a4b7cdf556d4f8393d1c61b57628813", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2471f7dc-ce", "ovs_interfaceid": "2471f7dc-ce20-49f1-92be-a9f8b557ae8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:14:24 compute-0 nova_compute[183278]: 2026-01-21 18:14:24.986 183284 DEBUG oslo_concurrency.lockutils [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Releasing lock "refresh_cache-c4cf9774-0343-498d-9bca-196666c53830" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 18:14:24 compute-0 nova_compute[183278]: 2026-01-21 18:14:24.986 183284 DEBUG nova.compute.manager [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: c4cf9774-0343-498d-9bca-196666c53830] Instance network_info: |[{"id": "2471f7dc-ce20-49f1-92be-a9f8b557ae8b", "address": "fa:16:3e:fd:5e:35", "network": {"id": "e8161199-7513-4099-89c4-00e7e075c92b", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-514974975-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2a4b7cdf556d4f8393d1c61b57628813", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2471f7dc-ce", "ovs_interfaceid": "2471f7dc-ce20-49f1-92be-a9f8b557ae8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 21 18:14:24 compute-0 nova_compute[183278]: 2026-01-21 18:14:24.987 183284 DEBUG oslo_concurrency.lockutils [req-0a2d1793-1ed8-4f3e-93fe-5628fc1c2532 req-ee7e8f5f-18b9-47ee-ba95-d9edfe09b6c4 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquired lock "refresh_cache-c4cf9774-0343-498d-9bca-196666c53830" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 18:14:24 compute-0 nova_compute[183278]: 2026-01-21 18:14:24.987 183284 DEBUG nova.network.neutron [req-0a2d1793-1ed8-4f3e-93fe-5628fc1c2532 req-ee7e8f5f-18b9-47ee-ba95-d9edfe09b6c4 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c4cf9774-0343-498d-9bca-196666c53830] Refreshing network info cache for port 2471f7dc-ce20-49f1-92be-a9f8b557ae8b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 18:14:24 compute-0 nova_compute[183278]: 2026-01-21 18:14:24.989 183284 DEBUG nova.virt.libvirt.driver [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: c4cf9774-0343-498d-9bca-196666c53830] Start _get_guest_xml network_info=[{"id": "2471f7dc-ce20-49f1-92be-a9f8b557ae8b", "address": "fa:16:3e:fd:5e:35", "network": {"id": "e8161199-7513-4099-89c4-00e7e075c92b", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-514974975-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2a4b7cdf556d4f8393d1c61b57628813", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2471f7dc-ce", "ovs_interfaceid": "2471f7dc-ce20-49f1-92be-a9f8b557ae8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T18:09:31Z,direct_url=<?>,disk_format='qcow2',id=672306ae-5521-4fc1-a825-a16d6d125c61,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='05bd8d3790e24414a90ecf55ee1939e1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T18:09:33Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_type': 'disk', 'encryption_options': None, 'boot_index': 0, 'encryption_secret_uuid': None, 'encryption_format': None, 'image_id': '672306ae-5521-4fc1-a825-a16d6d125c61'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 18:14:24 compute-0 nova_compute[183278]: 2026-01-21 18:14:24.993 183284 WARNING nova.virt.libvirt.driver [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 18:14:25 compute-0 nova_compute[183278]: 2026-01-21 18:14:25.003 183284 DEBUG nova.virt.libvirt.host [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 18:14:25 compute-0 nova_compute[183278]: 2026-01-21 18:14:25.003 183284 DEBUG nova.virt.libvirt.host [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 18:14:25 compute-0 nova_compute[183278]: 2026-01-21 18:14:25.007 183284 DEBUG nova.virt.libvirt.host [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 18:14:25 compute-0 nova_compute[183278]: 2026-01-21 18:14:25.007 183284 DEBUG nova.virt.libvirt.host [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 18:14:25 compute-0 nova_compute[183278]: 2026-01-21 18:14:25.008 183284 DEBUG nova.virt.libvirt.driver [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 18:14:25 compute-0 nova_compute[183278]: 2026-01-21 18:14:25.008 183284 DEBUG nova.virt.hardware [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T18:09:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='45095fe9-3fd5-4f1f-87b2-a2a8292135a2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T18:09:31Z,direct_url=<?>,disk_format='qcow2',id=672306ae-5521-4fc1-a825-a16d6d125c61,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='05bd8d3790e24414a90ecf55ee1939e1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T18:09:33Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 18:14:25 compute-0 nova_compute[183278]: 2026-01-21 18:14:25.009 183284 DEBUG nova.virt.hardware [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 18:14:25 compute-0 nova_compute[183278]: 2026-01-21 18:14:25.009 183284 DEBUG nova.virt.hardware [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 18:14:25 compute-0 nova_compute[183278]: 2026-01-21 18:14:25.009 183284 DEBUG nova.virt.hardware [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 18:14:25 compute-0 nova_compute[183278]: 2026-01-21 18:14:25.009 183284 DEBUG nova.virt.hardware [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 18:14:25 compute-0 nova_compute[183278]: 2026-01-21 18:14:25.009 183284 DEBUG nova.virt.hardware [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 18:14:25 compute-0 nova_compute[183278]: 2026-01-21 18:14:25.010 183284 DEBUG nova.virt.hardware [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 18:14:25 compute-0 nova_compute[183278]: 2026-01-21 18:14:25.010 183284 DEBUG nova.virt.hardware [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 18:14:25 compute-0 nova_compute[183278]: 2026-01-21 18:14:25.010 183284 DEBUG nova.virt.hardware [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 18:14:25 compute-0 nova_compute[183278]: 2026-01-21 18:14:25.010 183284 DEBUG nova.virt.hardware [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 18:14:25 compute-0 nova_compute[183278]: 2026-01-21 18:14:25.010 183284 DEBUG nova.virt.hardware [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 18:14:25 compute-0 nova_compute[183278]: 2026-01-21 18:14:25.014 183284 DEBUG nova.virt.libvirt.vif [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T18:14:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1653852301',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1653852301',id=6,image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2a4b7cdf556d4f8393d1c61b57628813',ramdisk_id='',reservation_id='r-pocehibf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-627352265',owner_user_name='tempest-TestExecuteActionsViaActuator-627352265-proje
ct-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T18:14:17Z,user_data=None,user_id='16f8ab2ae83b48f9a88753a5deddcc19',uuid=c4cf9774-0343-498d-9bca-196666c53830,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2471f7dc-ce20-49f1-92be-a9f8b557ae8b", "address": "fa:16:3e:fd:5e:35", "network": {"id": "e8161199-7513-4099-89c4-00e7e075c92b", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-514974975-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2a4b7cdf556d4f8393d1c61b57628813", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2471f7dc-ce", "ovs_interfaceid": "2471f7dc-ce20-49f1-92be-a9f8b557ae8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 18:14:25 compute-0 nova_compute[183278]: 2026-01-21 18:14:25.014 183284 DEBUG nova.network.os_vif_util [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Converting VIF {"id": "2471f7dc-ce20-49f1-92be-a9f8b557ae8b", "address": "fa:16:3e:fd:5e:35", "network": {"id": "e8161199-7513-4099-89c4-00e7e075c92b", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-514974975-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2a4b7cdf556d4f8393d1c61b57628813", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2471f7dc-ce", "ovs_interfaceid": "2471f7dc-ce20-49f1-92be-a9f8b557ae8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 18:14:25 compute-0 nova_compute[183278]: 2026-01-21 18:14:25.015 183284 DEBUG nova.network.os_vif_util [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fd:5e:35,bridge_name='br-int',has_traffic_filtering=True,id=2471f7dc-ce20-49f1-92be-a9f8b557ae8b,network=Network(e8161199-7513-4099-89c4-00e7e075c92b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2471f7dc-ce') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 18:14:25 compute-0 nova_compute[183278]: 2026-01-21 18:14:25.016 183284 DEBUG nova.objects.instance [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Lazy-loading 'pci_devices' on Instance uuid c4cf9774-0343-498d-9bca-196666c53830 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:14:25 compute-0 nova_compute[183278]: 2026-01-21 18:14:25.047 183284 DEBUG nova.virt.libvirt.driver [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: c4cf9774-0343-498d-9bca-196666c53830] End _get_guest_xml xml=<domain type="kvm">
Jan 21 18:14:25 compute-0 nova_compute[183278]:   <uuid>c4cf9774-0343-498d-9bca-196666c53830</uuid>
Jan 21 18:14:25 compute-0 nova_compute[183278]:   <name>instance-00000006</name>
Jan 21 18:14:25 compute-0 nova_compute[183278]:   <memory>131072</memory>
Jan 21 18:14:25 compute-0 nova_compute[183278]:   <vcpu>1</vcpu>
Jan 21 18:14:25 compute-0 nova_compute[183278]:   <metadata>
Jan 21 18:14:25 compute-0 nova_compute[183278]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 18:14:25 compute-0 nova_compute[183278]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 18:14:25 compute-0 nova_compute[183278]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-1653852301</nova:name>
Jan 21 18:14:25 compute-0 nova_compute[183278]:       <nova:creationTime>2026-01-21 18:14:24</nova:creationTime>
Jan 21 18:14:25 compute-0 nova_compute[183278]:       <nova:flavor name="m1.nano">
Jan 21 18:14:25 compute-0 nova_compute[183278]:         <nova:memory>128</nova:memory>
Jan 21 18:14:25 compute-0 nova_compute[183278]:         <nova:disk>1</nova:disk>
Jan 21 18:14:25 compute-0 nova_compute[183278]:         <nova:swap>0</nova:swap>
Jan 21 18:14:25 compute-0 nova_compute[183278]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 18:14:25 compute-0 nova_compute[183278]:         <nova:vcpus>1</nova:vcpus>
Jan 21 18:14:25 compute-0 nova_compute[183278]:       </nova:flavor>
Jan 21 18:14:25 compute-0 nova_compute[183278]:       <nova:owner>
Jan 21 18:14:25 compute-0 nova_compute[183278]:         <nova:user uuid="16f8ab2ae83b48f9a88753a5deddcc19">tempest-TestExecuteActionsViaActuator-627352265-project-member</nova:user>
Jan 21 18:14:25 compute-0 nova_compute[183278]:         <nova:project uuid="2a4b7cdf556d4f8393d1c61b57628813">tempest-TestExecuteActionsViaActuator-627352265</nova:project>
Jan 21 18:14:25 compute-0 nova_compute[183278]:       </nova:owner>
Jan 21 18:14:25 compute-0 nova_compute[183278]:       <nova:root type="image" uuid="672306ae-5521-4fc1-a825-a16d6d125c61"/>
Jan 21 18:14:25 compute-0 nova_compute[183278]:       <nova:ports>
Jan 21 18:14:25 compute-0 nova_compute[183278]:         <nova:port uuid="2471f7dc-ce20-49f1-92be-a9f8b557ae8b">
Jan 21 18:14:25 compute-0 nova_compute[183278]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 21 18:14:25 compute-0 nova_compute[183278]:         </nova:port>
Jan 21 18:14:25 compute-0 nova_compute[183278]:       </nova:ports>
Jan 21 18:14:25 compute-0 nova_compute[183278]:     </nova:instance>
Jan 21 18:14:25 compute-0 nova_compute[183278]:   </metadata>
Jan 21 18:14:25 compute-0 nova_compute[183278]:   <sysinfo type="smbios">
Jan 21 18:14:25 compute-0 nova_compute[183278]:     <system>
Jan 21 18:14:25 compute-0 nova_compute[183278]:       <entry name="manufacturer">RDO</entry>
Jan 21 18:14:25 compute-0 nova_compute[183278]:       <entry name="product">OpenStack Compute</entry>
Jan 21 18:14:25 compute-0 nova_compute[183278]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 18:14:25 compute-0 nova_compute[183278]:       <entry name="serial">c4cf9774-0343-498d-9bca-196666c53830</entry>
Jan 21 18:14:25 compute-0 nova_compute[183278]:       <entry name="uuid">c4cf9774-0343-498d-9bca-196666c53830</entry>
Jan 21 18:14:25 compute-0 nova_compute[183278]:       <entry name="family">Virtual Machine</entry>
Jan 21 18:14:25 compute-0 nova_compute[183278]:     </system>
Jan 21 18:14:25 compute-0 nova_compute[183278]:   </sysinfo>
Jan 21 18:14:25 compute-0 nova_compute[183278]:   <os>
Jan 21 18:14:25 compute-0 nova_compute[183278]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 18:14:25 compute-0 nova_compute[183278]:     <boot dev="hd"/>
Jan 21 18:14:25 compute-0 nova_compute[183278]:     <smbios mode="sysinfo"/>
Jan 21 18:14:25 compute-0 nova_compute[183278]:   </os>
Jan 21 18:14:25 compute-0 nova_compute[183278]:   <features>
Jan 21 18:14:25 compute-0 nova_compute[183278]:     <acpi/>
Jan 21 18:14:25 compute-0 nova_compute[183278]:     <apic/>
Jan 21 18:14:25 compute-0 nova_compute[183278]:     <vmcoreinfo/>
Jan 21 18:14:25 compute-0 nova_compute[183278]:   </features>
Jan 21 18:14:25 compute-0 nova_compute[183278]:   <clock offset="utc">
Jan 21 18:14:25 compute-0 nova_compute[183278]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 18:14:25 compute-0 nova_compute[183278]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 18:14:25 compute-0 nova_compute[183278]:     <timer name="hpet" present="no"/>
Jan 21 18:14:25 compute-0 nova_compute[183278]:   </clock>
Jan 21 18:14:25 compute-0 nova_compute[183278]:   <cpu mode="custom" match="exact">
Jan 21 18:14:25 compute-0 nova_compute[183278]:     <model>Nehalem</model>
Jan 21 18:14:25 compute-0 nova_compute[183278]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 18:14:25 compute-0 nova_compute[183278]:   </cpu>
Jan 21 18:14:25 compute-0 nova_compute[183278]:   <devices>
Jan 21 18:14:25 compute-0 nova_compute[183278]:     <disk type="file" device="disk">
Jan 21 18:14:25 compute-0 nova_compute[183278]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 18:14:25 compute-0 nova_compute[183278]:       <source file="/var/lib/nova/instances/c4cf9774-0343-498d-9bca-196666c53830/disk"/>
Jan 21 18:14:25 compute-0 nova_compute[183278]:       <target dev="vda" bus="virtio"/>
Jan 21 18:14:25 compute-0 nova_compute[183278]:     </disk>
Jan 21 18:14:25 compute-0 nova_compute[183278]:     <disk type="file" device="cdrom">
Jan 21 18:14:25 compute-0 nova_compute[183278]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 18:14:25 compute-0 nova_compute[183278]:       <source file="/var/lib/nova/instances/c4cf9774-0343-498d-9bca-196666c53830/disk.config"/>
Jan 21 18:14:25 compute-0 nova_compute[183278]:       <target dev="sda" bus="sata"/>
Jan 21 18:14:25 compute-0 nova_compute[183278]:     </disk>
Jan 21 18:14:25 compute-0 nova_compute[183278]:     <interface type="ethernet">
Jan 21 18:14:25 compute-0 nova_compute[183278]:       <mac address="fa:16:3e:fd:5e:35"/>
Jan 21 18:14:25 compute-0 nova_compute[183278]:       <model type="virtio"/>
Jan 21 18:14:25 compute-0 nova_compute[183278]:       <driver name="vhost" rx_queue_size="512"/>
Jan 21 18:14:25 compute-0 nova_compute[183278]:       <mtu size="1442"/>
Jan 21 18:14:25 compute-0 nova_compute[183278]:       <target dev="tap2471f7dc-ce"/>
Jan 21 18:14:25 compute-0 nova_compute[183278]:     </interface>
Jan 21 18:14:25 compute-0 nova_compute[183278]:     <serial type="pty">
Jan 21 18:14:25 compute-0 nova_compute[183278]:       <log file="/var/lib/nova/instances/c4cf9774-0343-498d-9bca-196666c53830/console.log" append="off"/>
Jan 21 18:14:25 compute-0 nova_compute[183278]:     </serial>
Jan 21 18:14:25 compute-0 nova_compute[183278]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 18:14:25 compute-0 nova_compute[183278]:     <video>
Jan 21 18:14:25 compute-0 nova_compute[183278]:       <model type="virtio"/>
Jan 21 18:14:25 compute-0 nova_compute[183278]:     </video>
Jan 21 18:14:25 compute-0 nova_compute[183278]:     <input type="tablet" bus="usb"/>
Jan 21 18:14:25 compute-0 nova_compute[183278]:     <rng model="virtio">
Jan 21 18:14:25 compute-0 nova_compute[183278]:       <backend model="random">/dev/urandom</backend>
Jan 21 18:14:25 compute-0 nova_compute[183278]:     </rng>
Jan 21 18:14:25 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root"/>
Jan 21 18:14:25 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:14:25 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:14:25 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:14:25 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:14:25 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:14:25 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:14:25 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:14:25 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:14:25 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:14:25 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:14:25 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:14:25 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:14:25 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:14:25 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:14:25 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:14:25 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:14:25 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:14:25 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:14:25 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:14:25 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:14:25 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:14:25 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:14:25 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:14:25 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:14:25 compute-0 nova_compute[183278]:     <controller type="usb" index="0"/>
Jan 21 18:14:25 compute-0 nova_compute[183278]:     <memballoon model="virtio">
Jan 21 18:14:25 compute-0 nova_compute[183278]:       <stats period="10"/>
Jan 21 18:14:25 compute-0 nova_compute[183278]:     </memballoon>
Jan 21 18:14:25 compute-0 nova_compute[183278]:   </devices>
Jan 21 18:14:25 compute-0 nova_compute[183278]: </domain>
Jan 21 18:14:25 compute-0 nova_compute[183278]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 18:14:25 compute-0 nova_compute[183278]: 2026-01-21 18:14:25.049 183284 DEBUG nova.compute.manager [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: c4cf9774-0343-498d-9bca-196666c53830] Preparing to wait for external event network-vif-plugged-2471f7dc-ce20-49f1-92be-a9f8b557ae8b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 21 18:14:25 compute-0 nova_compute[183278]: 2026-01-21 18:14:25.049 183284 DEBUG oslo_concurrency.lockutils [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Acquiring lock "c4cf9774-0343-498d-9bca-196666c53830-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:14:25 compute-0 nova_compute[183278]: 2026-01-21 18:14:25.050 183284 DEBUG oslo_concurrency.lockutils [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Lock "c4cf9774-0343-498d-9bca-196666c53830-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:14:25 compute-0 nova_compute[183278]: 2026-01-21 18:14:25.050 183284 DEBUG oslo_concurrency.lockutils [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Lock "c4cf9774-0343-498d-9bca-196666c53830-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:14:25 compute-0 nova_compute[183278]: 2026-01-21 18:14:25.051 183284 DEBUG nova.virt.libvirt.vif [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T18:14:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1653852301',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1653852301',id=6,image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2a4b7cdf556d4f8393d1c61b57628813',ramdisk_id='',reservation_id='r-pocehibf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-627352265',owner_user_name='tempest-TestExecuteActionsViaActuator-62735
2265-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T18:14:17Z,user_data=None,user_id='16f8ab2ae83b48f9a88753a5deddcc19',uuid=c4cf9774-0343-498d-9bca-196666c53830,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2471f7dc-ce20-49f1-92be-a9f8b557ae8b", "address": "fa:16:3e:fd:5e:35", "network": {"id": "e8161199-7513-4099-89c4-00e7e075c92b", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-514974975-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2a4b7cdf556d4f8393d1c61b57628813", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2471f7dc-ce", "ovs_interfaceid": "2471f7dc-ce20-49f1-92be-a9f8b557ae8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 21 18:14:25 compute-0 nova_compute[183278]: 2026-01-21 18:14:25.051 183284 DEBUG nova.network.os_vif_util [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Converting VIF {"id": "2471f7dc-ce20-49f1-92be-a9f8b557ae8b", "address": "fa:16:3e:fd:5e:35", "network": {"id": "e8161199-7513-4099-89c4-00e7e075c92b", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-514974975-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2a4b7cdf556d4f8393d1c61b57628813", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2471f7dc-ce", "ovs_interfaceid": "2471f7dc-ce20-49f1-92be-a9f8b557ae8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 18:14:25 compute-0 nova_compute[183278]: 2026-01-21 18:14:25.052 183284 DEBUG nova.network.os_vif_util [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fd:5e:35,bridge_name='br-int',has_traffic_filtering=True,id=2471f7dc-ce20-49f1-92be-a9f8b557ae8b,network=Network(e8161199-7513-4099-89c4-00e7e075c92b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2471f7dc-ce') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 18:14:25 compute-0 nova_compute[183278]: 2026-01-21 18:14:25.052 183284 DEBUG os_vif [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fd:5e:35,bridge_name='br-int',has_traffic_filtering=True,id=2471f7dc-ce20-49f1-92be-a9f8b557ae8b,network=Network(e8161199-7513-4099-89c4-00e7e075c92b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2471f7dc-ce') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 21 18:14:25 compute-0 nova_compute[183278]: 2026-01-21 18:14:25.053 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:14:25 compute-0 nova_compute[183278]: 2026-01-21 18:14:25.054 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:14:25 compute-0 nova_compute[183278]: 2026-01-21 18:14:25.054 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 18:14:25 compute-0 nova_compute[183278]: 2026-01-21 18:14:25.058 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:14:25 compute-0 nova_compute[183278]: 2026-01-21 18:14:25.058 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2471f7dc-ce, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:14:25 compute-0 nova_compute[183278]: 2026-01-21 18:14:25.059 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2471f7dc-ce, col_values=(('external_ids', {'iface-id': '2471f7dc-ce20-49f1-92be-a9f8b557ae8b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:fd:5e:35', 'vm-uuid': 'c4cf9774-0343-498d-9bca-196666c53830'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:14:25 compute-0 nova_compute[183278]: 2026-01-21 18:14:25.060 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:14:25 compute-0 NetworkManager[55506]: <info>  [1769019265.0613] manager: (tap2471f7dc-ce): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/30)
Jan 21 18:14:25 compute-0 nova_compute[183278]: 2026-01-21 18:14:25.063 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 18:14:25 compute-0 nova_compute[183278]: 2026-01-21 18:14:25.066 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:14:25 compute-0 nova_compute[183278]: 2026-01-21 18:14:25.067 183284 INFO os_vif [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fd:5e:35,bridge_name='br-int',has_traffic_filtering=True,id=2471f7dc-ce20-49f1-92be-a9f8b557ae8b,network=Network(e8161199-7513-4099-89c4-00e7e075c92b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2471f7dc-ce')
Jan 21 18:14:25 compute-0 nova_compute[183278]: 2026-01-21 18:14:25.619 183284 DEBUG nova.virt.libvirt.driver [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 18:14:25 compute-0 nova_compute[183278]: 2026-01-21 18:14:25.621 183284 DEBUG nova.virt.libvirt.driver [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 18:14:25 compute-0 nova_compute[183278]: 2026-01-21 18:14:25.621 183284 DEBUG nova.virt.libvirt.driver [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] No VIF found with MAC fa:16:3e:fd:5e:35, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 21 18:14:25 compute-0 nova_compute[183278]: 2026-01-21 18:14:25.622 183284 INFO nova.virt.libvirt.driver [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: c4cf9774-0343-498d-9bca-196666c53830] Using config drive
Jan 21 18:14:26 compute-0 ovn_controller[95419]: 2026-01-21T18:14:26Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:bf:14:07 10.100.0.6
Jan 21 18:14:26 compute-0 ovn_controller[95419]: 2026-01-21T18:14:26Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:bf:14:07 10.100.0.6
Jan 21 18:14:26 compute-0 nova_compute[183278]: 2026-01-21 18:14:26.446 183284 INFO nova.virt.libvirt.driver [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: c4cf9774-0343-498d-9bca-196666c53830] Creating config drive at /var/lib/nova/instances/c4cf9774-0343-498d-9bca-196666c53830/disk.config
Jan 21 18:14:26 compute-0 nova_compute[183278]: 2026-01-21 18:14:26.450 183284 DEBUG oslo_concurrency.processutils [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c4cf9774-0343-498d-9bca-196666c53830/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpg29otrbt execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:14:26 compute-0 nova_compute[183278]: 2026-01-21 18:14:26.572 183284 DEBUG oslo_concurrency.processutils [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c4cf9774-0343-498d-9bca-196666c53830/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpg29otrbt" returned: 0 in 0.122s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:14:26 compute-0 kernel: tap2471f7dc-ce: entered promiscuous mode
Jan 21 18:14:26 compute-0 NetworkManager[55506]: <info>  [1769019266.6440] manager: (tap2471f7dc-ce): new Tun device (/org/freedesktop/NetworkManager/Devices/31)
Jan 21 18:14:26 compute-0 nova_compute[183278]: 2026-01-21 18:14:26.698 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:14:26 compute-0 ovn_controller[95419]: 2026-01-21T18:14:26Z|00044|binding|INFO|Claiming lport 2471f7dc-ce20-49f1-92be-a9f8b557ae8b for this chassis.
Jan 21 18:14:26 compute-0 ovn_controller[95419]: 2026-01-21T18:14:26Z|00045|binding|INFO|2471f7dc-ce20-49f1-92be-a9f8b557ae8b: Claiming fa:16:3e:fd:5e:35 10.100.0.10
Jan 21 18:14:26 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:14:26.706 104698 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fd:5e:35 10.100.0.10'], port_security=['fa:16:3e:fd:5e:35 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'c4cf9774-0343-498d-9bca-196666c53830', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e8161199-7513-4099-89c4-00e7e075c92b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a4b7cdf556d4f8393d1c61b57628813', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c5ea7560-106a-40fd-a00a-355d8be6545e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=afee9644-a390-49fb-b346-3fd1c948feef, chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>], tunnel_key=8, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>], logical_port=2471f7dc-ce20-49f1-92be-a9f8b557ae8b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 18:14:26 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:14:26.707 104698 INFO neutron.agent.ovn.metadata.agent [-] Port 2471f7dc-ce20-49f1-92be-a9f8b557ae8b in datapath e8161199-7513-4099-89c4-00e7e075c92b bound to our chassis
Jan 21 18:14:26 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:14:26.708 104698 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e8161199-7513-4099-89c4-00e7e075c92b
Jan 21 18:14:26 compute-0 ovn_controller[95419]: 2026-01-21T18:14:26Z|00046|binding|INFO|Setting lport 2471f7dc-ce20-49f1-92be-a9f8b557ae8b ovn-installed in OVS
Jan 21 18:14:26 compute-0 ovn_controller[95419]: 2026-01-21T18:14:26Z|00047|binding|INFO|Setting lport 2471f7dc-ce20-49f1-92be-a9f8b557ae8b up in Southbound
Jan 21 18:14:26 compute-0 nova_compute[183278]: 2026-01-21 18:14:26.716 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:14:26 compute-0 systemd-machined[154592]: New machine qemu-4-instance-00000006.
Jan 21 18:14:26 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:14:26.725 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[71028926-fb4f-4831-9a52-52d92e1b1b67]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:14:26 compute-0 systemd[1]: Started Virtual Machine qemu-4-instance-00000006.
Jan 21 18:14:26 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:14:26.760 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[4c1a672e-d85b-44b5-9698-02cb7ae00614]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:14:26 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:14:26.763 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[94e259dc-36d6-4445-9dfc-05a6a6876c57]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:14:26 compute-0 systemd-udevd[204703]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 18:14:26 compute-0 NetworkManager[55506]: <info>  [1769019266.7795] device (tap2471f7dc-ce): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 18:14:26 compute-0 NetworkManager[55506]: <info>  [1769019266.7804] device (tap2471f7dc-ce): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 18:14:26 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:14:26.795 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[12d08d43-17c0-44d6-bc31-4a526a7ece94]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:14:26 compute-0 nova_compute[183278]: 2026-01-21 18:14:26.799 183284 DEBUG nova.network.neutron [req-0a2d1793-1ed8-4f3e-93fe-5628fc1c2532 req-ee7e8f5f-18b9-47ee-ba95-d9edfe09b6c4 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c4cf9774-0343-498d-9bca-196666c53830] Updated VIF entry in instance network info cache for port 2471f7dc-ce20-49f1-92be-a9f8b557ae8b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 18:14:26 compute-0 nova_compute[183278]: 2026-01-21 18:14:26.799 183284 DEBUG nova.network.neutron [req-0a2d1793-1ed8-4f3e-93fe-5628fc1c2532 req-ee7e8f5f-18b9-47ee-ba95-d9edfe09b6c4 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c4cf9774-0343-498d-9bca-196666c53830] Updating instance_info_cache with network_info: [{"id": "2471f7dc-ce20-49f1-92be-a9f8b557ae8b", "address": "fa:16:3e:fd:5e:35", "network": {"id": "e8161199-7513-4099-89c4-00e7e075c92b", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-514974975-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2a4b7cdf556d4f8393d1c61b57628813", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2471f7dc-ce", "ovs_interfaceid": "2471f7dc-ce20-49f1-92be-a9f8b557ae8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:14:26 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:14:26.813 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[78c77822-00c2-477a-b24d-acce320b802a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape8161199-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bf:ce:84'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 7, 'rx_bytes': 616, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 7, 'rx_bytes': 616, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 378321, 'reachable_time': 16740, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 204713, 'error': None, 'target': 'ovnmeta-e8161199-7513-4099-89c4-00e7e075c92b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:14:26 compute-0 nova_compute[183278]: 2026-01-21 18:14:26.825 183284 DEBUG oslo_concurrency.lockutils [req-0a2d1793-1ed8-4f3e-93fe-5628fc1c2532 req-ee7e8f5f-18b9-47ee-ba95-d9edfe09b6c4 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Releasing lock "refresh_cache-c4cf9774-0343-498d-9bca-196666c53830" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 18:14:26 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:14:26.826 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[952535e2-8026-4309-a8e8-3335972d48fb]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tape8161199-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 378331, 'tstamp': 378331}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 204714, 'error': None, 'target': 'ovnmeta-e8161199-7513-4099-89c4-00e7e075c92b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tape8161199-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 378334, 'tstamp': 378334}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 204714, 'error': None, 'target': 'ovnmeta-e8161199-7513-4099-89c4-00e7e075c92b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:14:26 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:14:26.828 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape8161199-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:14:26 compute-0 nova_compute[183278]: 2026-01-21 18:14:26.829 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:14:26 compute-0 nova_compute[183278]: 2026-01-21 18:14:26.830 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:14:26 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:14:26.830 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape8161199-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:14:26 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:14:26.831 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 18:14:26 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:14:26.831 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape8161199-70, col_values=(('external_ids', {'iface-id': 'd5993779-4a27-48a2-a904-ec457f58cb35'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:14:26 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:14:26.831 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 18:14:26 compute-0 nova_compute[183278]: 2026-01-21 18:14:26.894 183284 DEBUG nova.compute.manager [req-c9167797-de6a-43de-9984-f672aedc5a63 req-51315a65-e4d0-4679-bba4-bc3e2bf7b1bb 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c4cf9774-0343-498d-9bca-196666c53830] Received event network-vif-plugged-2471f7dc-ce20-49f1-92be-a9f8b557ae8b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:14:26 compute-0 nova_compute[183278]: 2026-01-21 18:14:26.894 183284 DEBUG oslo_concurrency.lockutils [req-c9167797-de6a-43de-9984-f672aedc5a63 req-51315a65-e4d0-4679-bba4-bc3e2bf7b1bb 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "c4cf9774-0343-498d-9bca-196666c53830-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:14:26 compute-0 nova_compute[183278]: 2026-01-21 18:14:26.895 183284 DEBUG oslo_concurrency.lockutils [req-c9167797-de6a-43de-9984-f672aedc5a63 req-51315a65-e4d0-4679-bba4-bc3e2bf7b1bb 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "c4cf9774-0343-498d-9bca-196666c53830-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:14:26 compute-0 nova_compute[183278]: 2026-01-21 18:14:26.895 183284 DEBUG oslo_concurrency.lockutils [req-c9167797-de6a-43de-9984-f672aedc5a63 req-51315a65-e4d0-4679-bba4-bc3e2bf7b1bb 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "c4cf9774-0343-498d-9bca-196666c53830-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:14:26 compute-0 nova_compute[183278]: 2026-01-21 18:14:26.895 183284 DEBUG nova.compute.manager [req-c9167797-de6a-43de-9984-f672aedc5a63 req-51315a65-e4d0-4679-bba4-bc3e2bf7b1bb 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c4cf9774-0343-498d-9bca-196666c53830] Processing event network-vif-plugged-2471f7dc-ce20-49f1-92be-a9f8b557ae8b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 21 18:14:27 compute-0 nova_compute[183278]: 2026-01-21 18:14:27.076 183284 DEBUG nova.virt.driver [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Emitting event <LifecycleEvent: 1769019267.075727, c4cf9774-0343-498d-9bca-196666c53830 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:14:27 compute-0 nova_compute[183278]: 2026-01-21 18:14:27.077 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: c4cf9774-0343-498d-9bca-196666c53830] VM Started (Lifecycle Event)
Jan 21 18:14:27 compute-0 nova_compute[183278]: 2026-01-21 18:14:27.079 183284 DEBUG nova.compute.manager [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: c4cf9774-0343-498d-9bca-196666c53830] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 18:14:27 compute-0 nova_compute[183278]: 2026-01-21 18:14:27.084 183284 DEBUG nova.virt.libvirt.driver [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: c4cf9774-0343-498d-9bca-196666c53830] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 18:14:27 compute-0 nova_compute[183278]: 2026-01-21 18:14:27.089 183284 INFO nova.virt.libvirt.driver [-] [instance: c4cf9774-0343-498d-9bca-196666c53830] Instance spawned successfully.
Jan 21 18:14:27 compute-0 nova_compute[183278]: 2026-01-21 18:14:27.089 183284 DEBUG nova.virt.libvirt.driver [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: c4cf9774-0343-498d-9bca-196666c53830] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 21 18:14:27 compute-0 nova_compute[183278]: 2026-01-21 18:14:27.093 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: c4cf9774-0343-498d-9bca-196666c53830] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:14:27 compute-0 nova_compute[183278]: 2026-01-21 18:14:27.113 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: c4cf9774-0343-498d-9bca-196666c53830] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 18:14:27 compute-0 nova_compute[183278]: 2026-01-21 18:14:27.117 183284 DEBUG nova.virt.libvirt.driver [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: c4cf9774-0343-498d-9bca-196666c53830] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:14:27 compute-0 nova_compute[183278]: 2026-01-21 18:14:27.118 183284 DEBUG nova.virt.libvirt.driver [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: c4cf9774-0343-498d-9bca-196666c53830] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:14:27 compute-0 nova_compute[183278]: 2026-01-21 18:14:27.118 183284 DEBUG nova.virt.libvirt.driver [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: c4cf9774-0343-498d-9bca-196666c53830] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:14:27 compute-0 nova_compute[183278]: 2026-01-21 18:14:27.118 183284 DEBUG nova.virt.libvirt.driver [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: c4cf9774-0343-498d-9bca-196666c53830] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:14:27 compute-0 nova_compute[183278]: 2026-01-21 18:14:27.119 183284 DEBUG nova.virt.libvirt.driver [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: c4cf9774-0343-498d-9bca-196666c53830] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:14:27 compute-0 nova_compute[183278]: 2026-01-21 18:14:27.119 183284 DEBUG nova.virt.libvirt.driver [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: c4cf9774-0343-498d-9bca-196666c53830] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:14:27 compute-0 nova_compute[183278]: 2026-01-21 18:14:27.148 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: c4cf9774-0343-498d-9bca-196666c53830] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 18:14:27 compute-0 nova_compute[183278]: 2026-01-21 18:14:27.149 183284 DEBUG nova.virt.driver [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Emitting event <LifecycleEvent: 1769019267.076852, c4cf9774-0343-498d-9bca-196666c53830 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:14:27 compute-0 nova_compute[183278]: 2026-01-21 18:14:27.149 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: c4cf9774-0343-498d-9bca-196666c53830] VM Paused (Lifecycle Event)
Jan 21 18:14:27 compute-0 nova_compute[183278]: 2026-01-21 18:14:27.195 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: c4cf9774-0343-498d-9bca-196666c53830] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:14:27 compute-0 nova_compute[183278]: 2026-01-21 18:14:27.199 183284 DEBUG nova.virt.driver [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Emitting event <LifecycleEvent: 1769019267.0824413, c4cf9774-0343-498d-9bca-196666c53830 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:14:27 compute-0 nova_compute[183278]: 2026-01-21 18:14:27.199 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: c4cf9774-0343-498d-9bca-196666c53830] VM Resumed (Lifecycle Event)
Jan 21 18:14:27 compute-0 nova_compute[183278]: 2026-01-21 18:14:27.205 183284 INFO nova.compute.manager [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: c4cf9774-0343-498d-9bca-196666c53830] Took 9.60 seconds to spawn the instance on the hypervisor.
Jan 21 18:14:27 compute-0 nova_compute[183278]: 2026-01-21 18:14:27.205 183284 DEBUG nova.compute.manager [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: c4cf9774-0343-498d-9bca-196666c53830] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:14:27 compute-0 nova_compute[183278]: 2026-01-21 18:14:27.231 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: c4cf9774-0343-498d-9bca-196666c53830] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:14:27 compute-0 nova_compute[183278]: 2026-01-21 18:14:27.235 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: c4cf9774-0343-498d-9bca-196666c53830] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 18:14:27 compute-0 nova_compute[183278]: 2026-01-21 18:14:27.275 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: c4cf9774-0343-498d-9bca-196666c53830] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 18:14:27 compute-0 nova_compute[183278]: 2026-01-21 18:14:27.286 183284 INFO nova.compute.manager [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: c4cf9774-0343-498d-9bca-196666c53830] Took 10.06 seconds to build instance.
Jan 21 18:14:27 compute-0 nova_compute[183278]: 2026-01-21 18:14:27.300 183284 DEBUG oslo_concurrency.lockutils [None req-80ac4ed2-c220-4748-8661-83b9b391fc41 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Lock "c4cf9774-0343-498d-9bca-196666c53830" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.134s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:14:29 compute-0 nova_compute[183278]: 2026-01-21 18:14:29.333 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:14:29 compute-0 nova_compute[183278]: 2026-01-21 18:14:29.338 183284 DEBUG nova.compute.manager [req-a01d1fbb-0397-43a0-834f-c92bb4cce0d3 req-0fc7dc98-5dc8-4e3f-8776-9f6af4ce08c3 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c4cf9774-0343-498d-9bca-196666c53830] Received event network-vif-plugged-2471f7dc-ce20-49f1-92be-a9f8b557ae8b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:14:29 compute-0 nova_compute[183278]: 2026-01-21 18:14:29.338 183284 DEBUG oslo_concurrency.lockutils [req-a01d1fbb-0397-43a0-834f-c92bb4cce0d3 req-0fc7dc98-5dc8-4e3f-8776-9f6af4ce08c3 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "c4cf9774-0343-498d-9bca-196666c53830-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:14:29 compute-0 nova_compute[183278]: 2026-01-21 18:14:29.338 183284 DEBUG oslo_concurrency.lockutils [req-a01d1fbb-0397-43a0-834f-c92bb4cce0d3 req-0fc7dc98-5dc8-4e3f-8776-9f6af4ce08c3 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "c4cf9774-0343-498d-9bca-196666c53830-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:14:29 compute-0 nova_compute[183278]: 2026-01-21 18:14:29.339 183284 DEBUG oslo_concurrency.lockutils [req-a01d1fbb-0397-43a0-834f-c92bb4cce0d3 req-0fc7dc98-5dc8-4e3f-8776-9f6af4ce08c3 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "c4cf9774-0343-498d-9bca-196666c53830-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:14:29 compute-0 nova_compute[183278]: 2026-01-21 18:14:29.339 183284 DEBUG nova.compute.manager [req-a01d1fbb-0397-43a0-834f-c92bb4cce0d3 req-0fc7dc98-5dc8-4e3f-8776-9f6af4ce08c3 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c4cf9774-0343-498d-9bca-196666c53830] No waiting events found dispatching network-vif-plugged-2471f7dc-ce20-49f1-92be-a9f8b557ae8b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:14:29 compute-0 nova_compute[183278]: 2026-01-21 18:14:29.339 183284 WARNING nova.compute.manager [req-a01d1fbb-0397-43a0-834f-c92bb4cce0d3 req-0fc7dc98-5dc8-4e3f-8776-9f6af4ce08c3 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c4cf9774-0343-498d-9bca-196666c53830] Received unexpected event network-vif-plugged-2471f7dc-ce20-49f1-92be-a9f8b557ae8b for instance with vm_state active and task_state None.
Jan 21 18:14:29 compute-0 podman[192560]: time="2026-01-21T18:14:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 18:14:29 compute-0 podman[192560]: @ - - [21/Jan/2026:18:14:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16587 "" "Go-http-client/1.1"
Jan 21 18:14:29 compute-0 podman[192560]: @ - - [21/Jan/2026:18:14:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2635 "" "Go-http-client/1.1"
Jan 21 18:14:30 compute-0 nova_compute[183278]: 2026-01-21 18:14:30.068 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:14:31 compute-0 openstack_network_exporter[195402]: ERROR   18:14:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 21 18:14:31 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:14:31 compute-0 openstack_network_exporter[195402]: ERROR   18:14:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 21 18:14:31 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:14:34 compute-0 podman[204722]: 2026-01-21 18:14:34.035849829 +0000 UTC m=+0.088072177 container health_status 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, architecture=x86_64, managed_by=edpm_ansible, release=1755695350, name=ubi9-minimal, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, distribution-scope=public, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 21 18:14:34 compute-0 nova_compute[183278]: 2026-01-21 18:14:34.144 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:14:35 compute-0 nova_compute[183278]: 2026-01-21 18:14:35.071 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:14:38 compute-0 nova_compute[183278]: 2026-01-21 18:14:38.268 183284 DEBUG nova.virt.libvirt.driver [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Check if temp file /var/lib/nova/instances/tmpta6zttmf exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Jan 21 18:14:38 compute-0 nova_compute[183278]: 2026-01-21 18:14:38.270 183284 DEBUG nova.compute.manager [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=71680,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpta6zttmf',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='b499883a-ee9f-4239-b996-4fbaa175bcc3',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Jan 21 18:14:38 compute-0 nova_compute[183278]: 2026-01-21 18:14:38.745 183284 DEBUG oslo_concurrency.processutils [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b499883a-ee9f-4239-b996-4fbaa175bcc3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:14:38 compute-0 nova_compute[183278]: 2026-01-21 18:14:38.799 183284 DEBUG oslo_concurrency.processutils [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b499883a-ee9f-4239-b996-4fbaa175bcc3/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:14:38 compute-0 nova_compute[183278]: 2026-01-21 18:14:38.800 183284 DEBUG oslo_concurrency.processutils [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b499883a-ee9f-4239-b996-4fbaa175bcc3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:14:38 compute-0 nova_compute[183278]: 2026-01-21 18:14:38.875 183284 DEBUG oslo_concurrency.processutils [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b499883a-ee9f-4239-b996-4fbaa175bcc3/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:14:39 compute-0 nova_compute[183278]: 2026-01-21 18:14:39.145 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:14:40 compute-0 nova_compute[183278]: 2026-01-21 18:14:40.074 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:14:41 compute-0 podman[204751]: 2026-01-21 18:14:41.062429467 +0000 UTC m=+0.081725743 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 21 18:14:41 compute-0 podman[204750]: 2026-01-21 18:14:41.063906613 +0000 UTC m=+0.112770096 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 21 18:14:42 compute-0 ovn_controller[95419]: 2026-01-21T18:14:42Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:fd:5e:35 10.100.0.10
Jan 21 18:14:42 compute-0 ovn_controller[95419]: 2026-01-21T18:14:42Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:fd:5e:35 10.100.0.10
Jan 21 18:14:43 compute-0 sshd-session[204819]: Accepted publickey for nova from 192.168.122.101 port 56532 ssh2: ECDSA SHA256:29a5JNhHHz2bb0ACqZTr6qOKeSRnhiTRA8SK+rzn9gs
Jan 21 18:14:43 compute-0 systemd-logind[782]: New session 28 of user nova.
Jan 21 18:14:43 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Jan 21 18:14:43 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 21 18:14:43 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 21 18:14:43 compute-0 systemd[1]: Starting User Manager for UID 42436...
Jan 21 18:14:43 compute-0 systemd[204823]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 21 18:14:43 compute-0 systemd[204823]: Queued start job for default target Main User Target.
Jan 21 18:14:43 compute-0 systemd[204823]: Created slice User Application Slice.
Jan 21 18:14:43 compute-0 systemd[204823]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 21 18:14:43 compute-0 systemd[204823]: Started Daily Cleanup of User's Temporary Directories.
Jan 21 18:14:43 compute-0 systemd[204823]: Reached target Paths.
Jan 21 18:14:43 compute-0 systemd[204823]: Reached target Timers.
Jan 21 18:14:43 compute-0 systemd[204823]: Starting D-Bus User Message Bus Socket...
Jan 21 18:14:43 compute-0 systemd[204823]: Starting Create User's Volatile Files and Directories...
Jan 21 18:14:43 compute-0 systemd[204823]: Listening on D-Bus User Message Bus Socket.
Jan 21 18:14:43 compute-0 systemd[204823]: Finished Create User's Volatile Files and Directories.
Jan 21 18:14:43 compute-0 systemd[204823]: Reached target Sockets.
Jan 21 18:14:43 compute-0 systemd[204823]: Reached target Basic System.
Jan 21 18:14:43 compute-0 systemd[204823]: Reached target Main User Target.
Jan 21 18:14:43 compute-0 systemd[204823]: Startup finished in 134ms.
Jan 21 18:14:43 compute-0 systemd[1]: Started User Manager for UID 42436.
Jan 21 18:14:43 compute-0 systemd[1]: Started Session 28 of User nova.
Jan 21 18:14:43 compute-0 sshd-session[204819]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 21 18:14:43 compute-0 sshd-session[204838]: Received disconnect from 192.168.122.101 port 56532:11: disconnected by user
Jan 21 18:14:43 compute-0 sshd-session[204838]: Disconnected from user nova 192.168.122.101 port 56532
Jan 21 18:14:43 compute-0 sshd-session[204819]: pam_unix(sshd:session): session closed for user nova
Jan 21 18:14:43 compute-0 systemd[1]: session-28.scope: Deactivated successfully.
Jan 21 18:14:43 compute-0 systemd-logind[782]: Session 28 logged out. Waiting for processes to exit.
Jan 21 18:14:43 compute-0 systemd-logind[782]: Removed session 28.
Jan 21 18:14:44 compute-0 nova_compute[183278]: 2026-01-21 18:14:44.147 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:14:44 compute-0 nova_compute[183278]: 2026-01-21 18:14:44.652 183284 DEBUG nova.compute.manager [req-58b69a1d-2911-43ea-8d76-a3ba4277d06f req-a5b14977-f0e4-4668-8f0c-aca68d1a1fb0 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Received event network-vif-unplugged-46d8fa01-a7bd-4849-904d-01dc9c7071a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:14:44 compute-0 nova_compute[183278]: 2026-01-21 18:14:44.652 183284 DEBUG oslo_concurrency.lockutils [req-58b69a1d-2911-43ea-8d76-a3ba4277d06f req-a5b14977-f0e4-4668-8f0c-aca68d1a1fb0 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "b499883a-ee9f-4239-b996-4fbaa175bcc3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:14:44 compute-0 nova_compute[183278]: 2026-01-21 18:14:44.652 183284 DEBUG oslo_concurrency.lockutils [req-58b69a1d-2911-43ea-8d76-a3ba4277d06f req-a5b14977-f0e4-4668-8f0c-aca68d1a1fb0 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "b499883a-ee9f-4239-b996-4fbaa175bcc3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:14:44 compute-0 nova_compute[183278]: 2026-01-21 18:14:44.653 183284 DEBUG oslo_concurrency.lockutils [req-58b69a1d-2911-43ea-8d76-a3ba4277d06f req-a5b14977-f0e4-4668-8f0c-aca68d1a1fb0 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "b499883a-ee9f-4239-b996-4fbaa175bcc3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:14:44 compute-0 nova_compute[183278]: 2026-01-21 18:14:44.653 183284 DEBUG nova.compute.manager [req-58b69a1d-2911-43ea-8d76-a3ba4277d06f req-a5b14977-f0e4-4668-8f0c-aca68d1a1fb0 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] No waiting events found dispatching network-vif-unplugged-46d8fa01-a7bd-4849-904d-01dc9c7071a4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:14:44 compute-0 nova_compute[183278]: 2026-01-21 18:14:44.653 183284 DEBUG nova.compute.manager [req-58b69a1d-2911-43ea-8d76-a3ba4277d06f req-a5b14977-f0e4-4668-8f0c-aca68d1a1fb0 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Received event network-vif-unplugged-46d8fa01-a7bd-4849-904d-01dc9c7071a4 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 21 18:14:45 compute-0 podman[204840]: 2026-01-21 18:14:45.074302493 +0000 UTC m=+0.049313067 container health_status 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 21 18:14:45 compute-0 nova_compute[183278]: 2026-01-21 18:14:45.076 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:14:45 compute-0 nova_compute[183278]: 2026-01-21 18:14:45.554 183284 INFO nova.compute.manager [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Took 6.68 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Jan 21 18:14:45 compute-0 nova_compute[183278]: 2026-01-21 18:14:45.555 183284 DEBUG nova.compute.manager [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 18:14:45 compute-0 nova_compute[183278]: 2026-01-21 18:14:45.572 183284 DEBUG nova.compute.manager [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=71680,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpta6zttmf',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='b499883a-ee9f-4239-b996-4fbaa175bcc3',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(1e41caa8-4d5a-4a21-bbec-18023bbd02ab),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Jan 21 18:14:45 compute-0 nova_compute[183278]: 2026-01-21 18:14:45.592 183284 DEBUG nova.objects.instance [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lazy-loading 'migration_context' on Instance uuid b499883a-ee9f-4239-b996-4fbaa175bcc3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:14:45 compute-0 nova_compute[183278]: 2026-01-21 18:14:45.593 183284 DEBUG nova.virt.libvirt.driver [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Jan 21 18:14:45 compute-0 nova_compute[183278]: 2026-01-21 18:14:45.595 183284 DEBUG nova.virt.libvirt.driver [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Jan 21 18:14:45 compute-0 nova_compute[183278]: 2026-01-21 18:14:45.595 183284 DEBUG nova.virt.libvirt.driver [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Jan 21 18:14:45 compute-0 nova_compute[183278]: 2026-01-21 18:14:45.611 183284 DEBUG nova.virt.libvirt.vif [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T18:13:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1264591926',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1264591926',id=3,image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T18:13:40Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='2a4b7cdf556d4f8393d1c61b57628813',ramdisk_id='',reservation_id='r-9cw14hr2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',ima
ge_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-627352265',owner_user_name='tempest-TestExecuteActionsViaActuator-627352265-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T18:13:40Z,user_data=None,user_id='16f8ab2ae83b48f9a88753a5deddcc19',uuid=b499883a-ee9f-4239-b996-4fbaa175bcc3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "46d8fa01-a7bd-4849-904d-01dc9c7071a4", "address": "fa:16:3e:39:dd:15", "network": {"id": "e8161199-7513-4099-89c4-00e7e075c92b", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-514974975-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2a4b7cdf556d4f8393d1c61b57628813", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap46d8fa01-a7", "ovs_interfaceid": "46d8fa01-a7bd-4849-904d-01dc9c7071a4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 18:14:45 compute-0 nova_compute[183278]: 2026-01-21 18:14:45.611 183284 DEBUG nova.network.os_vif_util [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Converting VIF {"id": "46d8fa01-a7bd-4849-904d-01dc9c7071a4", "address": "fa:16:3e:39:dd:15", "network": {"id": "e8161199-7513-4099-89c4-00e7e075c92b", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-514974975-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2a4b7cdf556d4f8393d1c61b57628813", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap46d8fa01-a7", "ovs_interfaceid": "46d8fa01-a7bd-4849-904d-01dc9c7071a4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 18:14:45 compute-0 nova_compute[183278]: 2026-01-21 18:14:45.612 183284 DEBUG nova.network.os_vif_util [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:39:dd:15,bridge_name='br-int',has_traffic_filtering=True,id=46d8fa01-a7bd-4849-904d-01dc9c7071a4,network=Network(e8161199-7513-4099-89c4-00e7e075c92b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap46d8fa01-a7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 18:14:45 compute-0 nova_compute[183278]: 2026-01-21 18:14:45.612 183284 DEBUG nova.virt.libvirt.migration [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Updating guest XML with vif config: <interface type="ethernet">
Jan 21 18:14:45 compute-0 nova_compute[183278]:   <mac address="fa:16:3e:39:dd:15"/>
Jan 21 18:14:45 compute-0 nova_compute[183278]:   <model type="virtio"/>
Jan 21 18:14:45 compute-0 nova_compute[183278]:   <driver name="vhost" rx_queue_size="512"/>
Jan 21 18:14:45 compute-0 nova_compute[183278]:   <mtu size="1442"/>
Jan 21 18:14:45 compute-0 nova_compute[183278]:   <target dev="tap46d8fa01-a7"/>
Jan 21 18:14:45 compute-0 nova_compute[183278]: </interface>
Jan 21 18:14:45 compute-0 nova_compute[183278]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Jan 21 18:14:45 compute-0 nova_compute[183278]: 2026-01-21 18:14:45.613 183284 DEBUG nova.virt.libvirt.driver [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Jan 21 18:14:46 compute-0 nova_compute[183278]: 2026-01-21 18:14:46.098 183284 DEBUG nova.virt.libvirt.migration [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 21 18:14:46 compute-0 nova_compute[183278]: 2026-01-21 18:14:46.099 183284 INFO nova.virt.libvirt.migration [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Increasing downtime to 50 ms after 0 sec elapsed time
Jan 21 18:14:46 compute-0 nova_compute[183278]: 2026-01-21 18:14:46.202 183284 INFO nova.virt.libvirt.driver [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Jan 21 18:14:46 compute-0 nova_compute[183278]: 2026-01-21 18:14:46.705 183284 DEBUG nova.virt.libvirt.migration [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 21 18:14:46 compute-0 nova_compute[183278]: 2026-01-21 18:14:46.705 183284 DEBUG nova.virt.libvirt.migration [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Jan 21 18:14:46 compute-0 nova_compute[183278]: 2026-01-21 18:14:46.744 183284 DEBUG nova.compute.manager [req-b9ffdc32-c7c2-41b0-9277-fdae3eeaff2c req-baff51b0-08d7-424e-ade2-fec2aadc259a 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Received event network-vif-plugged-46d8fa01-a7bd-4849-904d-01dc9c7071a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:14:46 compute-0 nova_compute[183278]: 2026-01-21 18:14:46.744 183284 DEBUG oslo_concurrency.lockutils [req-b9ffdc32-c7c2-41b0-9277-fdae3eeaff2c req-baff51b0-08d7-424e-ade2-fec2aadc259a 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "b499883a-ee9f-4239-b996-4fbaa175bcc3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:14:46 compute-0 nova_compute[183278]: 2026-01-21 18:14:46.744 183284 DEBUG oslo_concurrency.lockutils [req-b9ffdc32-c7c2-41b0-9277-fdae3eeaff2c req-baff51b0-08d7-424e-ade2-fec2aadc259a 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "b499883a-ee9f-4239-b996-4fbaa175bcc3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:14:46 compute-0 nova_compute[183278]: 2026-01-21 18:14:46.745 183284 DEBUG oslo_concurrency.lockutils [req-b9ffdc32-c7c2-41b0-9277-fdae3eeaff2c req-baff51b0-08d7-424e-ade2-fec2aadc259a 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "b499883a-ee9f-4239-b996-4fbaa175bcc3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:14:46 compute-0 nova_compute[183278]: 2026-01-21 18:14:46.745 183284 DEBUG nova.compute.manager [req-b9ffdc32-c7c2-41b0-9277-fdae3eeaff2c req-baff51b0-08d7-424e-ade2-fec2aadc259a 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] No waiting events found dispatching network-vif-plugged-46d8fa01-a7bd-4849-904d-01dc9c7071a4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:14:46 compute-0 nova_compute[183278]: 2026-01-21 18:14:46.745 183284 WARNING nova.compute.manager [req-b9ffdc32-c7c2-41b0-9277-fdae3eeaff2c req-baff51b0-08d7-424e-ade2-fec2aadc259a 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Received unexpected event network-vif-plugged-46d8fa01-a7bd-4849-904d-01dc9c7071a4 for instance with vm_state active and task_state migrating.
Jan 21 18:14:46 compute-0 nova_compute[183278]: 2026-01-21 18:14:46.745 183284 DEBUG nova.compute.manager [req-b9ffdc32-c7c2-41b0-9277-fdae3eeaff2c req-baff51b0-08d7-424e-ade2-fec2aadc259a 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Received event network-changed-46d8fa01-a7bd-4849-904d-01dc9c7071a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:14:46 compute-0 nova_compute[183278]: 2026-01-21 18:14:46.746 183284 DEBUG nova.compute.manager [req-b9ffdc32-c7c2-41b0-9277-fdae3eeaff2c req-baff51b0-08d7-424e-ade2-fec2aadc259a 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Refreshing instance network info cache due to event network-changed-46d8fa01-a7bd-4849-904d-01dc9c7071a4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 18:14:46 compute-0 nova_compute[183278]: 2026-01-21 18:14:46.746 183284 DEBUG oslo_concurrency.lockutils [req-b9ffdc32-c7c2-41b0-9277-fdae3eeaff2c req-baff51b0-08d7-424e-ade2-fec2aadc259a 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "refresh_cache-b499883a-ee9f-4239-b996-4fbaa175bcc3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:14:46 compute-0 nova_compute[183278]: 2026-01-21 18:14:46.746 183284 DEBUG oslo_concurrency.lockutils [req-b9ffdc32-c7c2-41b0-9277-fdae3eeaff2c req-baff51b0-08d7-424e-ade2-fec2aadc259a 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquired lock "refresh_cache-b499883a-ee9f-4239-b996-4fbaa175bcc3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 18:14:46 compute-0 nova_compute[183278]: 2026-01-21 18:14:46.746 183284 DEBUG nova.network.neutron [req-b9ffdc32-c7c2-41b0-9277-fdae3eeaff2c req-baff51b0-08d7-424e-ade2-fec2aadc259a 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Refreshing network info cache for port 46d8fa01-a7bd-4849-904d-01dc9c7071a4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 18:14:47 compute-0 nova_compute[183278]: 2026-01-21 18:14:47.209 183284 DEBUG nova.virt.libvirt.migration [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 21 18:14:47 compute-0 nova_compute[183278]: 2026-01-21 18:14:47.209 183284 DEBUG nova.virt.libvirt.migration [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Jan 21 18:14:47 compute-0 nova_compute[183278]: 2026-01-21 18:14:47.712 183284 DEBUG nova.virt.libvirt.migration [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 21 18:14:47 compute-0 nova_compute[183278]: 2026-01-21 18:14:47.713 183284 DEBUG nova.virt.libvirt.migration [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Jan 21 18:14:47 compute-0 nova_compute[183278]: 2026-01-21 18:14:47.732 183284 DEBUG nova.network.neutron [req-b9ffdc32-c7c2-41b0-9277-fdae3eeaff2c req-baff51b0-08d7-424e-ade2-fec2aadc259a 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Updated VIF entry in instance network info cache for port 46d8fa01-a7bd-4849-904d-01dc9c7071a4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 18:14:47 compute-0 nova_compute[183278]: 2026-01-21 18:14:47.732 183284 DEBUG nova.network.neutron [req-b9ffdc32-c7c2-41b0-9277-fdae3eeaff2c req-baff51b0-08d7-424e-ade2-fec2aadc259a 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Updating instance_info_cache with network_info: [{"id": "46d8fa01-a7bd-4849-904d-01dc9c7071a4", "address": "fa:16:3e:39:dd:15", "network": {"id": "e8161199-7513-4099-89c4-00e7e075c92b", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-514974975-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2a4b7cdf556d4f8393d1c61b57628813", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap46d8fa01-a7", "ovs_interfaceid": "46d8fa01-a7bd-4849-904d-01dc9c7071a4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:14:47 compute-0 nova_compute[183278]: 2026-01-21 18:14:47.757 183284 DEBUG oslo_concurrency.lockutils [req-b9ffdc32-c7c2-41b0-9277-fdae3eeaff2c req-baff51b0-08d7-424e-ade2-fec2aadc259a 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Releasing lock "refresh_cache-b499883a-ee9f-4239-b996-4fbaa175bcc3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 18:14:48 compute-0 nova_compute[183278]: 2026-01-21 18:14:48.217 183284 DEBUG nova.virt.libvirt.migration [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 21 18:14:48 compute-0 nova_compute[183278]: 2026-01-21 18:14:48.217 183284 DEBUG nova.virt.libvirt.migration [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Jan 21 18:14:48 compute-0 nova_compute[183278]: 2026-01-21 18:14:48.853 183284 DEBUG nova.virt.driver [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Emitting event <LifecycleEvent: 1769019288.8528752, b499883a-ee9f-4239-b996-4fbaa175bcc3 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:14:48 compute-0 nova_compute[183278]: 2026-01-21 18:14:48.854 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] VM Paused (Lifecycle Event)
Jan 21 18:14:48 compute-0 nova_compute[183278]: 2026-01-21 18:14:48.856 183284 DEBUG nova.virt.libvirt.migration [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Current 50 elapsed 3 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 21 18:14:48 compute-0 nova_compute[183278]: 2026-01-21 18:14:48.856 183284 DEBUG nova.virt.libvirt.migration [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Jan 21 18:14:48 compute-0 nova_compute[183278]: 2026-01-21 18:14:48.885 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:14:48 compute-0 nova_compute[183278]: 2026-01-21 18:14:48.891 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 18:14:48 compute-0 nova_compute[183278]: 2026-01-21 18:14:48.910 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] During sync_power_state the instance has a pending task (migrating). Skip.
Jan 21 18:14:49 compute-0 nova_compute[183278]: 2026-01-21 18:14:49.149 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:14:49 compute-0 kernel: tap46d8fa01-a7 (unregistering): left promiscuous mode
Jan 21 18:14:49 compute-0 NetworkManager[55506]: <info>  [1769019289.2124] device (tap46d8fa01-a7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 18:14:49 compute-0 ovn_controller[95419]: 2026-01-21T18:14:49Z|00048|binding|INFO|Releasing lport 46d8fa01-a7bd-4849-904d-01dc9c7071a4 from this chassis (sb_readonly=0)
Jan 21 18:14:49 compute-0 ovn_controller[95419]: 2026-01-21T18:14:49Z|00049|binding|INFO|Setting lport 46d8fa01-a7bd-4849-904d-01dc9c7071a4 down in Southbound
Jan 21 18:14:49 compute-0 nova_compute[183278]: 2026-01-21 18:14:49.221 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:14:49 compute-0 ovn_controller[95419]: 2026-01-21T18:14:49Z|00050|binding|INFO|Removing iface tap46d8fa01-a7 ovn-installed in OVS
Jan 21 18:14:49 compute-0 nova_compute[183278]: 2026-01-21 18:14:49.225 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:14:49 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:14:49.230 104698 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:39:dd:15 10.100.0.12'], port_security=['fa:16:3e:39:dd:15 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '88a62794-b4a4-47e3-9cce-91e574e684c1'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'b499883a-ee9f-4239-b996-4fbaa175bcc3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e8161199-7513-4099-89c4-00e7e075c92b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a4b7cdf556d4f8393d1c61b57628813', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'c5ea7560-106a-40fd-a00a-355d8be6545e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=afee9644-a390-49fb-b346-3fd1c948feef, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>], logical_port=46d8fa01-a7bd-4849-904d-01dc9c7071a4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 18:14:49 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:14:49.231 104698 INFO neutron.agent.ovn.metadata.agent [-] Port 46d8fa01-a7bd-4849-904d-01dc9c7071a4 in datapath e8161199-7513-4099-89c4-00e7e075c92b unbound from our chassis
Jan 21 18:14:49 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:14:49.233 104698 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e8161199-7513-4099-89c4-00e7e075c92b
Jan 21 18:14:49 compute-0 nova_compute[183278]: 2026-01-21 18:14:49.235 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:14:49 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:14:49.250 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[b2371a05-e873-4445-9dc5-a3340a46abed]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:14:49 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:14:49.277 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[4b6355fc-a309-4bb0-9608-8ad8454bf903]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:14:49 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:14:49.280 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[d578d3c1-042a-4aac-8429-0b8045e5cc3b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:14:49 compute-0 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000003.scope: Deactivated successfully.
Jan 21 18:14:49 compute-0 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000003.scope: Consumed 15.543s CPU time.
Jan 21 18:14:49 compute-0 systemd-machined[154592]: Machine qemu-2-instance-00000003 terminated.
Jan 21 18:14:49 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:14:49.309 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[2b9cdca3-c69d-4643-adb1-0c147678fc41]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:14:49 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:14:49.326 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[4414d2cc-3b3d-4b0c-9682-c7f3b6db0f15]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape8161199-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bf:ce:84'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 11, 'tx_packets': 9, 'rx_bytes': 742, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 11, 'tx_packets': 9, 'rx_bytes': 742, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 378321, 'reachable_time': 16740, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 204895, 'error': None, 'target': 'ovnmeta-e8161199-7513-4099-89c4-00e7e075c92b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:14:49 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:14:49.341 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[7b66b284-f9f8-4e2b-ab0c-5b32d7b725de]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tape8161199-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 378331, 'tstamp': 378331}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 204896, 'error': None, 'target': 'ovnmeta-e8161199-7513-4099-89c4-00e7e075c92b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tape8161199-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 378334, 'tstamp': 378334}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 204896, 'error': None, 'target': 'ovnmeta-e8161199-7513-4099-89c4-00e7e075c92b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:14:49 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:14:49.343 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape8161199-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:14:49 compute-0 nova_compute[183278]: 2026-01-21 18:14:49.345 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:14:49 compute-0 nova_compute[183278]: 2026-01-21 18:14:49.350 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:14:49 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:14:49.350 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape8161199-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:14:49 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:14:49.350 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 18:14:49 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:14:49.351 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape8161199-70, col_values=(('external_ids', {'iface-id': 'd5993779-4a27-48a2-a904-ec457f58cb35'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:14:49 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:14:49.351 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 18:14:49 compute-0 nova_compute[183278]: 2026-01-21 18:14:49.439 183284 DEBUG nova.virt.libvirt.guest [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Jan 21 18:14:49 compute-0 nova_compute[183278]: 2026-01-21 18:14:49.439 183284 INFO nova.virt.libvirt.driver [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Migration operation has completed
Jan 21 18:14:49 compute-0 nova_compute[183278]: 2026-01-21 18:14:49.440 183284 INFO nova.compute.manager [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] _post_live_migration() is started..
Jan 21 18:14:49 compute-0 nova_compute[183278]: 2026-01-21 18:14:49.443 183284 DEBUG nova.virt.libvirt.driver [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Jan 21 18:14:49 compute-0 nova_compute[183278]: 2026-01-21 18:14:49.443 183284 DEBUG nova.virt.libvirt.driver [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Jan 21 18:14:49 compute-0 nova_compute[183278]: 2026-01-21 18:14:49.444 183284 DEBUG nova.virt.libvirt.driver [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Jan 21 18:14:49 compute-0 nova_compute[183278]: 2026-01-21 18:14:49.640 183284 DEBUG nova.compute.manager [req-a40978a2-cf89-4eba-a8e0-8a8f112097ca req-b6db1407-d6ea-48e4-999b-34d2e8f4d365 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Received event network-vif-unplugged-46d8fa01-a7bd-4849-904d-01dc9c7071a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:14:49 compute-0 nova_compute[183278]: 2026-01-21 18:14:49.641 183284 DEBUG oslo_concurrency.lockutils [req-a40978a2-cf89-4eba-a8e0-8a8f112097ca req-b6db1407-d6ea-48e4-999b-34d2e8f4d365 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "b499883a-ee9f-4239-b996-4fbaa175bcc3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:14:49 compute-0 nova_compute[183278]: 2026-01-21 18:14:49.642 183284 DEBUG oslo_concurrency.lockutils [req-a40978a2-cf89-4eba-a8e0-8a8f112097ca req-b6db1407-d6ea-48e4-999b-34d2e8f4d365 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "b499883a-ee9f-4239-b996-4fbaa175bcc3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:14:49 compute-0 nova_compute[183278]: 2026-01-21 18:14:49.642 183284 DEBUG oslo_concurrency.lockutils [req-a40978a2-cf89-4eba-a8e0-8a8f112097ca req-b6db1407-d6ea-48e4-999b-34d2e8f4d365 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "b499883a-ee9f-4239-b996-4fbaa175bcc3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:14:49 compute-0 nova_compute[183278]: 2026-01-21 18:14:49.642 183284 DEBUG nova.compute.manager [req-a40978a2-cf89-4eba-a8e0-8a8f112097ca req-b6db1407-d6ea-48e4-999b-34d2e8f4d365 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] No waiting events found dispatching network-vif-unplugged-46d8fa01-a7bd-4849-904d-01dc9c7071a4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:14:49 compute-0 nova_compute[183278]: 2026-01-21 18:14:49.642 183284 DEBUG nova.compute.manager [req-a40978a2-cf89-4eba-a8e0-8a8f112097ca req-b6db1407-d6ea-48e4-999b-34d2e8f4d365 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Received event network-vif-unplugged-46d8fa01-a7bd-4849-904d-01dc9c7071a4 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 21 18:14:50 compute-0 nova_compute[183278]: 2026-01-21 18:14:50.079 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:14:50 compute-0 nova_compute[183278]: 2026-01-21 18:14:50.954 183284 DEBUG nova.network.neutron [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Activated binding for port 46d8fa01-a7bd-4849-904d-01dc9c7071a4 and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Jan 21 18:14:50 compute-0 nova_compute[183278]: 2026-01-21 18:14:50.954 183284 DEBUG nova.compute.manager [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "46d8fa01-a7bd-4849-904d-01dc9c7071a4", "address": "fa:16:3e:39:dd:15", "network": {"id": "e8161199-7513-4099-89c4-00e7e075c92b", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-514974975-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2a4b7cdf556d4f8393d1c61b57628813", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap46d8fa01-a7", "ovs_interfaceid": "46d8fa01-a7bd-4849-904d-01dc9c7071a4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Jan 21 18:14:50 compute-0 nova_compute[183278]: 2026-01-21 18:14:50.955 183284 DEBUG nova.virt.libvirt.vif [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T18:13:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1264591926',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1264591926',id=3,image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T18:13:40Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='2a4b7cdf556d4f8393d1c61b57628813',ramdisk_id='',reservation_id='r-9cw14hr2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',ima
ge_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-627352265',owner_user_name='tempest-TestExecuteActionsViaActuator-627352265-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T18:14:36Z,user_data=None,user_id='16f8ab2ae83b48f9a88753a5deddcc19',uuid=b499883a-ee9f-4239-b996-4fbaa175bcc3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "46d8fa01-a7bd-4849-904d-01dc9c7071a4", "address": "fa:16:3e:39:dd:15", "network": {"id": "e8161199-7513-4099-89c4-00e7e075c92b", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-514974975-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2a4b7cdf556d4f8393d1c61b57628813", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap46d8fa01-a7", "ovs_interfaceid": "46d8fa01-a7bd-4849-904d-01dc9c7071a4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 18:14:50 compute-0 nova_compute[183278]: 2026-01-21 18:14:50.955 183284 DEBUG nova.network.os_vif_util [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Converting VIF {"id": "46d8fa01-a7bd-4849-904d-01dc9c7071a4", "address": "fa:16:3e:39:dd:15", "network": {"id": "e8161199-7513-4099-89c4-00e7e075c92b", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-514974975-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2a4b7cdf556d4f8393d1c61b57628813", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap46d8fa01-a7", "ovs_interfaceid": "46d8fa01-a7bd-4849-904d-01dc9c7071a4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 18:14:50 compute-0 nova_compute[183278]: 2026-01-21 18:14:50.956 183284 DEBUG nova.network.os_vif_util [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:39:dd:15,bridge_name='br-int',has_traffic_filtering=True,id=46d8fa01-a7bd-4849-904d-01dc9c7071a4,network=Network(e8161199-7513-4099-89c4-00e7e075c92b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap46d8fa01-a7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 18:14:50 compute-0 nova_compute[183278]: 2026-01-21 18:14:50.956 183284 DEBUG os_vif [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:39:dd:15,bridge_name='br-int',has_traffic_filtering=True,id=46d8fa01-a7bd-4849-904d-01dc9c7071a4,network=Network(e8161199-7513-4099-89c4-00e7e075c92b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap46d8fa01-a7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 21 18:14:50 compute-0 nova_compute[183278]: 2026-01-21 18:14:50.958 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:14:50 compute-0 nova_compute[183278]: 2026-01-21 18:14:50.958 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap46d8fa01-a7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:14:50 compute-0 nova_compute[183278]: 2026-01-21 18:14:50.959 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:14:50 compute-0 nova_compute[183278]: 2026-01-21 18:14:50.961 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:14:50 compute-0 nova_compute[183278]: 2026-01-21 18:14:50.963 183284 INFO os_vif [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:39:dd:15,bridge_name='br-int',has_traffic_filtering=True,id=46d8fa01-a7bd-4849-904d-01dc9c7071a4,network=Network(e8161199-7513-4099-89c4-00e7e075c92b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap46d8fa01-a7')
Jan 21 18:14:50 compute-0 nova_compute[183278]: 2026-01-21 18:14:50.964 183284 DEBUG oslo_concurrency.lockutils [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:14:50 compute-0 nova_compute[183278]: 2026-01-21 18:14:50.964 183284 DEBUG oslo_concurrency.lockutils [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:14:50 compute-0 nova_compute[183278]: 2026-01-21 18:14:50.964 183284 DEBUG oslo_concurrency.lockutils [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:14:50 compute-0 nova_compute[183278]: 2026-01-21 18:14:50.964 183284 DEBUG nova.compute.manager [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Jan 21 18:14:50 compute-0 nova_compute[183278]: 2026-01-21 18:14:50.965 183284 INFO nova.virt.libvirt.driver [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Deleting instance files /var/lib/nova/instances/b499883a-ee9f-4239-b996-4fbaa175bcc3_del
Jan 21 18:14:50 compute-0 nova_compute[183278]: 2026-01-21 18:14:50.965 183284 INFO nova.virt.libvirt.driver [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Deletion of /var/lib/nova/instances/b499883a-ee9f-4239-b996-4fbaa175bcc3_del complete
Jan 21 18:14:51 compute-0 nova_compute[183278]: 2026-01-21 18:14:51.757 183284 DEBUG nova.compute.manager [req-af41c48c-4ec4-42c2-9ff0-a1d42beba620 req-56b5e294-94da-4638-9f18-b37761b34226 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Received event network-vif-plugged-46d8fa01-a7bd-4849-904d-01dc9c7071a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:14:51 compute-0 nova_compute[183278]: 2026-01-21 18:14:51.758 183284 DEBUG oslo_concurrency.lockutils [req-af41c48c-4ec4-42c2-9ff0-a1d42beba620 req-56b5e294-94da-4638-9f18-b37761b34226 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "b499883a-ee9f-4239-b996-4fbaa175bcc3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:14:51 compute-0 nova_compute[183278]: 2026-01-21 18:14:51.758 183284 DEBUG oslo_concurrency.lockutils [req-af41c48c-4ec4-42c2-9ff0-a1d42beba620 req-56b5e294-94da-4638-9f18-b37761b34226 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "b499883a-ee9f-4239-b996-4fbaa175bcc3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:14:51 compute-0 nova_compute[183278]: 2026-01-21 18:14:51.758 183284 DEBUG oslo_concurrency.lockutils [req-af41c48c-4ec4-42c2-9ff0-a1d42beba620 req-56b5e294-94da-4638-9f18-b37761b34226 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "b499883a-ee9f-4239-b996-4fbaa175bcc3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:14:51 compute-0 nova_compute[183278]: 2026-01-21 18:14:51.758 183284 DEBUG nova.compute.manager [req-af41c48c-4ec4-42c2-9ff0-a1d42beba620 req-56b5e294-94da-4638-9f18-b37761b34226 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] No waiting events found dispatching network-vif-plugged-46d8fa01-a7bd-4849-904d-01dc9c7071a4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:14:51 compute-0 nova_compute[183278]: 2026-01-21 18:14:51.759 183284 WARNING nova.compute.manager [req-af41c48c-4ec4-42c2-9ff0-a1d42beba620 req-56b5e294-94da-4638-9f18-b37761b34226 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Received unexpected event network-vif-plugged-46d8fa01-a7bd-4849-904d-01dc9c7071a4 for instance with vm_state active and task_state migrating.
Jan 21 18:14:51 compute-0 nova_compute[183278]: 2026-01-21 18:14:51.759 183284 DEBUG nova.compute.manager [req-af41c48c-4ec4-42c2-9ff0-a1d42beba620 req-56b5e294-94da-4638-9f18-b37761b34226 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Received event network-vif-plugged-46d8fa01-a7bd-4849-904d-01dc9c7071a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:14:51 compute-0 nova_compute[183278]: 2026-01-21 18:14:51.759 183284 DEBUG oslo_concurrency.lockutils [req-af41c48c-4ec4-42c2-9ff0-a1d42beba620 req-56b5e294-94da-4638-9f18-b37761b34226 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "b499883a-ee9f-4239-b996-4fbaa175bcc3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:14:51 compute-0 nova_compute[183278]: 2026-01-21 18:14:51.759 183284 DEBUG oslo_concurrency.lockutils [req-af41c48c-4ec4-42c2-9ff0-a1d42beba620 req-56b5e294-94da-4638-9f18-b37761b34226 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "b499883a-ee9f-4239-b996-4fbaa175bcc3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:14:51 compute-0 nova_compute[183278]: 2026-01-21 18:14:51.759 183284 DEBUG oslo_concurrency.lockutils [req-af41c48c-4ec4-42c2-9ff0-a1d42beba620 req-56b5e294-94da-4638-9f18-b37761b34226 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "b499883a-ee9f-4239-b996-4fbaa175bcc3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:14:51 compute-0 nova_compute[183278]: 2026-01-21 18:14:51.759 183284 DEBUG nova.compute.manager [req-af41c48c-4ec4-42c2-9ff0-a1d42beba620 req-56b5e294-94da-4638-9f18-b37761b34226 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] No waiting events found dispatching network-vif-plugged-46d8fa01-a7bd-4849-904d-01dc9c7071a4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:14:51 compute-0 nova_compute[183278]: 2026-01-21 18:14:51.760 183284 WARNING nova.compute.manager [req-af41c48c-4ec4-42c2-9ff0-a1d42beba620 req-56b5e294-94da-4638-9f18-b37761b34226 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Received unexpected event network-vif-plugged-46d8fa01-a7bd-4849-904d-01dc9c7071a4 for instance with vm_state active and task_state migrating.
Jan 21 18:14:51 compute-0 nova_compute[183278]: 2026-01-21 18:14:51.760 183284 DEBUG nova.compute.manager [req-af41c48c-4ec4-42c2-9ff0-a1d42beba620 req-56b5e294-94da-4638-9f18-b37761b34226 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Received event network-vif-unplugged-46d8fa01-a7bd-4849-904d-01dc9c7071a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:14:51 compute-0 nova_compute[183278]: 2026-01-21 18:14:51.760 183284 DEBUG oslo_concurrency.lockutils [req-af41c48c-4ec4-42c2-9ff0-a1d42beba620 req-56b5e294-94da-4638-9f18-b37761b34226 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "b499883a-ee9f-4239-b996-4fbaa175bcc3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:14:51 compute-0 nova_compute[183278]: 2026-01-21 18:14:51.760 183284 DEBUG oslo_concurrency.lockutils [req-af41c48c-4ec4-42c2-9ff0-a1d42beba620 req-56b5e294-94da-4638-9f18-b37761b34226 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "b499883a-ee9f-4239-b996-4fbaa175bcc3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:14:51 compute-0 nova_compute[183278]: 2026-01-21 18:14:51.760 183284 DEBUG oslo_concurrency.lockutils [req-af41c48c-4ec4-42c2-9ff0-a1d42beba620 req-56b5e294-94da-4638-9f18-b37761b34226 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "b499883a-ee9f-4239-b996-4fbaa175bcc3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:14:51 compute-0 nova_compute[183278]: 2026-01-21 18:14:51.760 183284 DEBUG nova.compute.manager [req-af41c48c-4ec4-42c2-9ff0-a1d42beba620 req-56b5e294-94da-4638-9f18-b37761b34226 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] No waiting events found dispatching network-vif-unplugged-46d8fa01-a7bd-4849-904d-01dc9c7071a4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:14:51 compute-0 nova_compute[183278]: 2026-01-21 18:14:51.760 183284 DEBUG nova.compute.manager [req-af41c48c-4ec4-42c2-9ff0-a1d42beba620 req-56b5e294-94da-4638-9f18-b37761b34226 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Received event network-vif-unplugged-46d8fa01-a7bd-4849-904d-01dc9c7071a4 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 21 18:14:51 compute-0 nova_compute[183278]: 2026-01-21 18:14:51.761 183284 DEBUG nova.compute.manager [req-af41c48c-4ec4-42c2-9ff0-a1d42beba620 req-56b5e294-94da-4638-9f18-b37761b34226 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Received event network-vif-plugged-46d8fa01-a7bd-4849-904d-01dc9c7071a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:14:51 compute-0 nova_compute[183278]: 2026-01-21 18:14:51.761 183284 DEBUG oslo_concurrency.lockutils [req-af41c48c-4ec4-42c2-9ff0-a1d42beba620 req-56b5e294-94da-4638-9f18-b37761b34226 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "b499883a-ee9f-4239-b996-4fbaa175bcc3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:14:51 compute-0 nova_compute[183278]: 2026-01-21 18:14:51.761 183284 DEBUG oslo_concurrency.lockutils [req-af41c48c-4ec4-42c2-9ff0-a1d42beba620 req-56b5e294-94da-4638-9f18-b37761b34226 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "b499883a-ee9f-4239-b996-4fbaa175bcc3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:14:51 compute-0 nova_compute[183278]: 2026-01-21 18:14:51.761 183284 DEBUG oslo_concurrency.lockutils [req-af41c48c-4ec4-42c2-9ff0-a1d42beba620 req-56b5e294-94da-4638-9f18-b37761b34226 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "b499883a-ee9f-4239-b996-4fbaa175bcc3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:14:51 compute-0 nova_compute[183278]: 2026-01-21 18:14:51.761 183284 DEBUG nova.compute.manager [req-af41c48c-4ec4-42c2-9ff0-a1d42beba620 req-56b5e294-94da-4638-9f18-b37761b34226 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] No waiting events found dispatching network-vif-plugged-46d8fa01-a7bd-4849-904d-01dc9c7071a4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:14:51 compute-0 nova_compute[183278]: 2026-01-21 18:14:51.761 183284 WARNING nova.compute.manager [req-af41c48c-4ec4-42c2-9ff0-a1d42beba620 req-56b5e294-94da-4638-9f18-b37761b34226 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Received unexpected event network-vif-plugged-46d8fa01-a7bd-4849-904d-01dc9c7071a4 for instance with vm_state active and task_state migrating.
Jan 21 18:14:51 compute-0 nova_compute[183278]: 2026-01-21 18:14:51.761 183284 DEBUG nova.compute.manager [req-af41c48c-4ec4-42c2-9ff0-a1d42beba620 req-56b5e294-94da-4638-9f18-b37761b34226 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Received event network-vif-plugged-46d8fa01-a7bd-4849-904d-01dc9c7071a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:14:51 compute-0 nova_compute[183278]: 2026-01-21 18:14:51.762 183284 DEBUG oslo_concurrency.lockutils [req-af41c48c-4ec4-42c2-9ff0-a1d42beba620 req-56b5e294-94da-4638-9f18-b37761b34226 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "b499883a-ee9f-4239-b996-4fbaa175bcc3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:14:51 compute-0 nova_compute[183278]: 2026-01-21 18:14:51.762 183284 DEBUG oslo_concurrency.lockutils [req-af41c48c-4ec4-42c2-9ff0-a1d42beba620 req-56b5e294-94da-4638-9f18-b37761b34226 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "b499883a-ee9f-4239-b996-4fbaa175bcc3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:14:51 compute-0 nova_compute[183278]: 2026-01-21 18:14:51.762 183284 DEBUG oslo_concurrency.lockutils [req-af41c48c-4ec4-42c2-9ff0-a1d42beba620 req-56b5e294-94da-4638-9f18-b37761b34226 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "b499883a-ee9f-4239-b996-4fbaa175bcc3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:14:51 compute-0 nova_compute[183278]: 2026-01-21 18:14:51.762 183284 DEBUG nova.compute.manager [req-af41c48c-4ec4-42c2-9ff0-a1d42beba620 req-56b5e294-94da-4638-9f18-b37761b34226 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] No waiting events found dispatching network-vif-plugged-46d8fa01-a7bd-4849-904d-01dc9c7071a4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:14:51 compute-0 nova_compute[183278]: 2026-01-21 18:14:51.762 183284 WARNING nova.compute.manager [req-af41c48c-4ec4-42c2-9ff0-a1d42beba620 req-56b5e294-94da-4638-9f18-b37761b34226 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Received unexpected event network-vif-plugged-46d8fa01-a7bd-4849-904d-01dc9c7071a4 for instance with vm_state active and task_state migrating.
Jan 21 18:14:53 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Jan 21 18:14:53 compute-0 systemd[204823]: Activating special unit Exit the Session...
Jan 21 18:14:53 compute-0 systemd[204823]: Stopped target Main User Target.
Jan 21 18:14:53 compute-0 systemd[204823]: Stopped target Basic System.
Jan 21 18:14:53 compute-0 systemd[204823]: Stopped target Paths.
Jan 21 18:14:53 compute-0 systemd[204823]: Stopped target Sockets.
Jan 21 18:14:53 compute-0 systemd[204823]: Stopped target Timers.
Jan 21 18:14:53 compute-0 systemd[204823]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 21 18:14:53 compute-0 systemd[204823]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 21 18:14:53 compute-0 systemd[204823]: Closed D-Bus User Message Bus Socket.
Jan 21 18:14:53 compute-0 systemd[204823]: Stopped Create User's Volatile Files and Directories.
Jan 21 18:14:53 compute-0 systemd[204823]: Removed slice User Application Slice.
Jan 21 18:14:53 compute-0 systemd[204823]: Reached target Shutdown.
Jan 21 18:14:53 compute-0 systemd[204823]: Finished Exit the Session.
Jan 21 18:14:53 compute-0 systemd[204823]: Reached target Exit the Session.
Jan 21 18:14:53 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Jan 21 18:14:53 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Jan 21 18:14:53 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 21 18:14:53 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 21 18:14:53 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 21 18:14:53 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 21 18:14:53 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Jan 21 18:14:54 compute-0 nova_compute[183278]: 2026-01-21 18:14:54.150 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:14:55 compute-0 nova_compute[183278]: 2026-01-21 18:14:55.703 183284 DEBUG oslo_concurrency.lockutils [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "b499883a-ee9f-4239-b996-4fbaa175bcc3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:14:55 compute-0 nova_compute[183278]: 2026-01-21 18:14:55.703 183284 DEBUG oslo_concurrency.lockutils [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lock "b499883a-ee9f-4239-b996-4fbaa175bcc3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:14:55 compute-0 nova_compute[183278]: 2026-01-21 18:14:55.704 183284 DEBUG oslo_concurrency.lockutils [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lock "b499883a-ee9f-4239-b996-4fbaa175bcc3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:14:55 compute-0 nova_compute[183278]: 2026-01-21 18:14:55.745 183284 DEBUG oslo_concurrency.lockutils [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:14:55 compute-0 nova_compute[183278]: 2026-01-21 18:14:55.746 183284 DEBUG oslo_concurrency.lockutils [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:14:55 compute-0 nova_compute[183278]: 2026-01-21 18:14:55.746 183284 DEBUG oslo_concurrency.lockutils [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:14:55 compute-0 nova_compute[183278]: 2026-01-21 18:14:55.746 183284 DEBUG nova.compute.resource_tracker [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 18:14:55 compute-0 nova_compute[183278]: 2026-01-21 18:14:55.961 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:14:56 compute-0 nova_compute[183278]: 2026-01-21 18:14:56.214 183284 DEBUG oslo_concurrency.processutils [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:14:56 compute-0 nova_compute[183278]: 2026-01-21 18:14:56.272 183284 DEBUG oslo_concurrency.processutils [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:14:56 compute-0 nova_compute[183278]: 2026-01-21 18:14:56.273 183284 DEBUG oslo_concurrency.processutils [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:14:56 compute-0 nova_compute[183278]: 2026-01-21 18:14:56.325 183284 DEBUG oslo_concurrency.processutils [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:14:56 compute-0 nova_compute[183278]: 2026-01-21 18:14:56.331 183284 DEBUG oslo_concurrency.processutils [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c4cf9774-0343-498d-9bca-196666c53830/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:14:56 compute-0 nova_compute[183278]: 2026-01-21 18:14:56.387 183284 DEBUG oslo_concurrency.processutils [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c4cf9774-0343-498d-9bca-196666c53830/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:14:56 compute-0 nova_compute[183278]: 2026-01-21 18:14:56.387 183284 DEBUG oslo_concurrency.processutils [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c4cf9774-0343-498d-9bca-196666c53830/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:14:56 compute-0 nova_compute[183278]: 2026-01-21 18:14:56.455 183284 DEBUG oslo_concurrency.processutils [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c4cf9774-0343-498d-9bca-196666c53830/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:14:56 compute-0 nova_compute[183278]: 2026-01-21 18:14:56.655 183284 WARNING nova.virt.libvirt.driver [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 18:14:56 compute-0 nova_compute[183278]: 2026-01-21 18:14:56.656 183284 DEBUG nova.compute.resource_tracker [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5551MB free_disk=73.32627868652344GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": 
"0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 18:14:56 compute-0 nova_compute[183278]: 2026-01-21 18:14:56.656 183284 DEBUG oslo_concurrency.lockutils [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:14:56 compute-0 nova_compute[183278]: 2026-01-21 18:14:56.657 183284 DEBUG oslo_concurrency.lockutils [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:14:56 compute-0 ovn_controller[95419]: 2026-01-21T18:14:56Z|00051|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Jan 21 18:14:56 compute-0 nova_compute[183278]: 2026-01-21 18:14:56.709 183284 DEBUG nova.compute.resource_tracker [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Migration for instance b499883a-ee9f-4239-b996-4fbaa175bcc3 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Jan 21 18:14:56 compute-0 nova_compute[183278]: 2026-01-21 18:14:56.743 183284 DEBUG nova.compute.resource_tracker [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Jan 21 18:14:56 compute-0 nova_compute[183278]: 2026-01-21 18:14:56.780 183284 DEBUG nova.compute.resource_tracker [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Migration 1e41caa8-4d5a-4a21-bbec-18023bbd02ab is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Jan 21 18:14:56 compute-0 nova_compute[183278]: 2026-01-21 18:14:56.781 183284 DEBUG nova.compute.resource_tracker [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Instance b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 21 18:14:56 compute-0 nova_compute[183278]: 2026-01-21 18:14:56.781 183284 DEBUG nova.compute.resource_tracker [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Instance c4cf9774-0343-498d-9bca-196666c53830 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 21 18:14:56 compute-0 nova_compute[183278]: 2026-01-21 18:14:56.781 183284 DEBUG nova.compute.resource_tracker [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 18:14:56 compute-0 nova_compute[183278]: 2026-01-21 18:14:56.781 183284 DEBUG nova.compute.resource_tracker [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 18:14:56 compute-0 nova_compute[183278]: 2026-01-21 18:14:56.860 183284 DEBUG nova.compute.provider_tree [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Inventory has not changed in ProviderTree for provider: 502e4243-611b-433d-a766-9b485d51652d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 18:14:56 compute-0 nova_compute[183278]: 2026-01-21 18:14:56.877 183284 DEBUG nova.scheduler.client.report [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Inventory has not changed for provider 502e4243-611b-433d-a766-9b485d51652d based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 18:14:56 compute-0 nova_compute[183278]: 2026-01-21 18:14:56.894 183284 DEBUG nova.compute.resource_tracker [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 18:14:56 compute-0 nova_compute[183278]: 2026-01-21 18:14:56.894 183284 DEBUG oslo_concurrency.lockutils [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.237s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:14:56 compute-0 nova_compute[183278]: 2026-01-21 18:14:56.900 183284 INFO nova.compute.manager [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Jan 21 18:14:56 compute-0 nova_compute[183278]: 2026-01-21 18:14:56.970 183284 INFO nova.scheduler.client.report [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Deleted allocation for migration 1e41caa8-4d5a-4a21-bbec-18023bbd02ab
Jan 21 18:14:56 compute-0 nova_compute[183278]: 2026-01-21 18:14:56.971 183284 DEBUG nova.virt.libvirt.driver [None req-70610d6f-6544-4c41-b338-5c0d14c970a5 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Jan 21 18:14:58 compute-0 nova_compute[183278]: 2026-01-21 18:14:58.512 183284 DEBUG oslo_concurrency.lockutils [None req-39d0b9e5-801a-4264-8265-6065237f06dc 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Acquiring lock "c4cf9774-0343-498d-9bca-196666c53830" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:14:58 compute-0 nova_compute[183278]: 2026-01-21 18:14:58.512 183284 DEBUG oslo_concurrency.lockutils [None req-39d0b9e5-801a-4264-8265-6065237f06dc 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Lock "c4cf9774-0343-498d-9bca-196666c53830" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:14:58 compute-0 nova_compute[183278]: 2026-01-21 18:14:58.512 183284 DEBUG oslo_concurrency.lockutils [None req-39d0b9e5-801a-4264-8265-6065237f06dc 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Acquiring lock "c4cf9774-0343-498d-9bca-196666c53830-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:14:58 compute-0 nova_compute[183278]: 2026-01-21 18:14:58.513 183284 DEBUG oslo_concurrency.lockutils [None req-39d0b9e5-801a-4264-8265-6065237f06dc 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Lock "c4cf9774-0343-498d-9bca-196666c53830-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:14:58 compute-0 nova_compute[183278]: 2026-01-21 18:14:58.513 183284 DEBUG oslo_concurrency.lockutils [None req-39d0b9e5-801a-4264-8265-6065237f06dc 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Lock "c4cf9774-0343-498d-9bca-196666c53830-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:14:58 compute-0 nova_compute[183278]: 2026-01-21 18:14:58.514 183284 INFO nova.compute.manager [None req-39d0b9e5-801a-4264-8265-6065237f06dc 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: c4cf9774-0343-498d-9bca-196666c53830] Terminating instance
Jan 21 18:14:58 compute-0 nova_compute[183278]: 2026-01-21 18:14:58.516 183284 DEBUG nova.compute.manager [None req-39d0b9e5-801a-4264-8265-6065237f06dc 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: c4cf9774-0343-498d-9bca-196666c53830] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 21 18:14:58 compute-0 kernel: tap2471f7dc-ce (unregistering): left promiscuous mode
Jan 21 18:14:58 compute-0 NetworkManager[55506]: <info>  [1769019298.5431] device (tap2471f7dc-ce): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 18:14:58 compute-0 nova_compute[183278]: 2026-01-21 18:14:58.550 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:14:58 compute-0 ovn_controller[95419]: 2026-01-21T18:14:58Z|00052|binding|INFO|Releasing lport 2471f7dc-ce20-49f1-92be-a9f8b557ae8b from this chassis (sb_readonly=0)
Jan 21 18:14:58 compute-0 ovn_controller[95419]: 2026-01-21T18:14:58Z|00053|binding|INFO|Setting lport 2471f7dc-ce20-49f1-92be-a9f8b557ae8b down in Southbound
Jan 21 18:14:58 compute-0 ovn_controller[95419]: 2026-01-21T18:14:58Z|00054|binding|INFO|Removing iface tap2471f7dc-ce ovn-installed in OVS
Jan 21 18:14:58 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:14:58.560 104698 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fd:5e:35 10.100.0.10'], port_security=['fa:16:3e:fd:5e:35 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'c4cf9774-0343-498d-9bca-196666c53830', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e8161199-7513-4099-89c4-00e7e075c92b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a4b7cdf556d4f8393d1c61b57628813', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c5ea7560-106a-40fd-a00a-355d8be6545e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=afee9644-a390-49fb-b346-3fd1c948feef, chassis=[], tunnel_key=8, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>], logical_port=2471f7dc-ce20-49f1-92be-a9f8b557ae8b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 18:14:58 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:14:58.562 104698 INFO neutron.agent.ovn.metadata.agent [-] Port 2471f7dc-ce20-49f1-92be-a9f8b557ae8b in datapath e8161199-7513-4099-89c4-00e7e075c92b unbound from our chassis
Jan 21 18:14:58 compute-0 nova_compute[183278]: 2026-01-21 18:14:58.563 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:14:58 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:14:58.564 104698 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e8161199-7513-4099-89c4-00e7e075c92b
Jan 21 18:14:58 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:14:58.581 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[603611ca-4eb4-4edb-860a-96c7cee4a97a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:14:58 compute-0 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000006.scope: Deactivated successfully.
Jan 21 18:14:58 compute-0 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000006.scope: Consumed 15.984s CPU time.
Jan 21 18:14:58 compute-0 systemd-machined[154592]: Machine qemu-4-instance-00000006 terminated.
Jan 21 18:14:58 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:14:58.607 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[30317c56-eaae-45cd-83f3-3527cf7d181d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:14:58 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:14:58.609 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[a2b662b5-56a6-42da-9428-5d6c8374878b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:14:58 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:14:58.629 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[2bd32245-b5ce-489c-8d18-8a68ecb666aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:14:58 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:14:58.645 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[12b19070-5f08-4589-afeb-18c940c11747]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape8161199-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bf:ce:84'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 11, 'rx_bytes': 784, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 11, 'rx_bytes': 784, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 378321, 'reachable_time': 16740, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 204941, 'error': None, 'target': 'ovnmeta-e8161199-7513-4099-89c4-00e7e075c92b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:14:58 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:14:58.659 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[4596fd1f-b04b-44cf-a0c4-6e5ba44780f8]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tape8161199-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 378331, 'tstamp': 378331}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 204942, 'error': None, 'target': 'ovnmeta-e8161199-7513-4099-89c4-00e7e075c92b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tape8161199-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 378334, 'tstamp': 378334}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 204942, 'error': None, 'target': 'ovnmeta-e8161199-7513-4099-89c4-00e7e075c92b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:14:58 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:14:58.661 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape8161199-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:14:58 compute-0 nova_compute[183278]: 2026-01-21 18:14:58.662 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:14:58 compute-0 nova_compute[183278]: 2026-01-21 18:14:58.666 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:14:58 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:14:58.666 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape8161199-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:14:58 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:14:58.666 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 18:14:58 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:14:58.667 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape8161199-70, col_values=(('external_ids', {'iface-id': 'd5993779-4a27-48a2-a904-ec457f58cb35'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:14:58 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:14:58.667 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 18:14:58 compute-0 nova_compute[183278]: 2026-01-21 18:14:58.782 183284 INFO nova.virt.libvirt.driver [-] [instance: c4cf9774-0343-498d-9bca-196666c53830] Instance destroyed successfully.
Jan 21 18:14:58 compute-0 nova_compute[183278]: 2026-01-21 18:14:58.783 183284 DEBUG nova.objects.instance [None req-39d0b9e5-801a-4264-8265-6065237f06dc 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Lazy-loading 'resources' on Instance uuid c4cf9774-0343-498d-9bca-196666c53830 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:14:58 compute-0 nova_compute[183278]: 2026-01-21 18:14:58.799 183284 DEBUG nova.virt.libvirt.vif [None req-39d0b9e5-801a-4264-8265-6065237f06dc 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T18:14:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1653852301',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1653852301',id=6,image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T18:14:27Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2a4b7cdf556d4f8393d1c61b57628813',ramdisk_id='',reservation_id='r-pocehibf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',ow
ner_project_name='tempest-TestExecuteActionsViaActuator-627352265',owner_user_name='tempest-TestExecuteActionsViaActuator-627352265-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T18:14:27Z,user_data=None,user_id='16f8ab2ae83b48f9a88753a5deddcc19',uuid=c4cf9774-0343-498d-9bca-196666c53830,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2471f7dc-ce20-49f1-92be-a9f8b557ae8b", "address": "fa:16:3e:fd:5e:35", "network": {"id": "e8161199-7513-4099-89c4-00e7e075c92b", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-514974975-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2a4b7cdf556d4f8393d1c61b57628813", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2471f7dc-ce", "ovs_interfaceid": "2471f7dc-ce20-49f1-92be-a9f8b557ae8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 18:14:58 compute-0 nova_compute[183278]: 2026-01-21 18:14:58.799 183284 DEBUG nova.network.os_vif_util [None req-39d0b9e5-801a-4264-8265-6065237f06dc 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Converting VIF {"id": "2471f7dc-ce20-49f1-92be-a9f8b557ae8b", "address": "fa:16:3e:fd:5e:35", "network": {"id": "e8161199-7513-4099-89c4-00e7e075c92b", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-514974975-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2a4b7cdf556d4f8393d1c61b57628813", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2471f7dc-ce", "ovs_interfaceid": "2471f7dc-ce20-49f1-92be-a9f8b557ae8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 18:14:58 compute-0 nova_compute[183278]: 2026-01-21 18:14:58.801 183284 DEBUG nova.network.os_vif_util [None req-39d0b9e5-801a-4264-8265-6065237f06dc 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fd:5e:35,bridge_name='br-int',has_traffic_filtering=True,id=2471f7dc-ce20-49f1-92be-a9f8b557ae8b,network=Network(e8161199-7513-4099-89c4-00e7e075c92b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2471f7dc-ce') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 18:14:58 compute-0 nova_compute[183278]: 2026-01-21 18:14:58.802 183284 DEBUG os_vif [None req-39d0b9e5-801a-4264-8265-6065237f06dc 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fd:5e:35,bridge_name='br-int',has_traffic_filtering=True,id=2471f7dc-ce20-49f1-92be-a9f8b557ae8b,network=Network(e8161199-7513-4099-89c4-00e7e075c92b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2471f7dc-ce') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 21 18:14:58 compute-0 nova_compute[183278]: 2026-01-21 18:14:58.804 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:14:58 compute-0 nova_compute[183278]: 2026-01-21 18:14:58.805 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2471f7dc-ce, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:14:58 compute-0 nova_compute[183278]: 2026-01-21 18:14:58.853 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:14:58 compute-0 nova_compute[183278]: 2026-01-21 18:14:58.856 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 18:14:58 compute-0 nova_compute[183278]: 2026-01-21 18:14:58.859 183284 INFO os_vif [None req-39d0b9e5-801a-4264-8265-6065237f06dc 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fd:5e:35,bridge_name='br-int',has_traffic_filtering=True,id=2471f7dc-ce20-49f1-92be-a9f8b557ae8b,network=Network(e8161199-7513-4099-89c4-00e7e075c92b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2471f7dc-ce')
Jan 21 18:14:58 compute-0 nova_compute[183278]: 2026-01-21 18:14:58.860 183284 INFO nova.virt.libvirt.driver [None req-39d0b9e5-801a-4264-8265-6065237f06dc 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: c4cf9774-0343-498d-9bca-196666c53830] Deleting instance files /var/lib/nova/instances/c4cf9774-0343-498d-9bca-196666c53830_del
Jan 21 18:14:58 compute-0 nova_compute[183278]: 2026-01-21 18:14:58.862 183284 INFO nova.virt.libvirt.driver [None req-39d0b9e5-801a-4264-8265-6065237f06dc 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: c4cf9774-0343-498d-9bca-196666c53830] Deletion of /var/lib/nova/instances/c4cf9774-0343-498d-9bca-196666c53830_del complete
Jan 21 18:14:58 compute-0 nova_compute[183278]: 2026-01-21 18:14:58.925 183284 INFO nova.compute.manager [None req-39d0b9e5-801a-4264-8265-6065237f06dc 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: c4cf9774-0343-498d-9bca-196666c53830] Took 0.41 seconds to destroy the instance on the hypervisor.
Jan 21 18:14:58 compute-0 nova_compute[183278]: 2026-01-21 18:14:58.925 183284 DEBUG oslo.service.loopingcall [None req-39d0b9e5-801a-4264-8265-6065237f06dc 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 21 18:14:58 compute-0 nova_compute[183278]: 2026-01-21 18:14:58.925 183284 DEBUG nova.compute.manager [-] [instance: c4cf9774-0343-498d-9bca-196666c53830] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 21 18:14:58 compute-0 nova_compute[183278]: 2026-01-21 18:14:58.926 183284 DEBUG nova.network.neutron [-] [instance: c4cf9774-0343-498d-9bca-196666c53830] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 21 18:14:59 compute-0 nova_compute[183278]: 2026-01-21 18:14:59.152 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:15:00 compute-0 podman[192560]: time="2026-01-21T18:15:00Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 18:15:00 compute-0 nova_compute[183278]: 2026-01-21 18:15:00.113 183284 DEBUG nova.compute.manager [req-ecc50bcb-baec-4bb2-9d3f-bb6f19cbe73d req-2d992ac1-c815-4f7b-91dc-d424e0c448af 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c4cf9774-0343-498d-9bca-196666c53830] Received event network-vif-unplugged-2471f7dc-ce20-49f1-92be-a9f8b557ae8b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:15:00 compute-0 nova_compute[183278]: 2026-01-21 18:15:00.114 183284 DEBUG oslo_concurrency.lockutils [req-ecc50bcb-baec-4bb2-9d3f-bb6f19cbe73d req-2d992ac1-c815-4f7b-91dc-d424e0c448af 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "c4cf9774-0343-498d-9bca-196666c53830-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:15:00 compute-0 nova_compute[183278]: 2026-01-21 18:15:00.114 183284 DEBUG oslo_concurrency.lockutils [req-ecc50bcb-baec-4bb2-9d3f-bb6f19cbe73d req-2d992ac1-c815-4f7b-91dc-d424e0c448af 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "c4cf9774-0343-498d-9bca-196666c53830-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:15:00 compute-0 nova_compute[183278]: 2026-01-21 18:15:00.114 183284 DEBUG oslo_concurrency.lockutils [req-ecc50bcb-baec-4bb2-9d3f-bb6f19cbe73d req-2d992ac1-c815-4f7b-91dc-d424e0c448af 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "c4cf9774-0343-498d-9bca-196666c53830-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:15:00 compute-0 nova_compute[183278]: 2026-01-21 18:15:00.114 183284 DEBUG nova.compute.manager [req-ecc50bcb-baec-4bb2-9d3f-bb6f19cbe73d req-2d992ac1-c815-4f7b-91dc-d424e0c448af 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c4cf9774-0343-498d-9bca-196666c53830] No waiting events found dispatching network-vif-unplugged-2471f7dc-ce20-49f1-92be-a9f8b557ae8b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:15:00 compute-0 nova_compute[183278]: 2026-01-21 18:15:00.115 183284 DEBUG nova.compute.manager [req-ecc50bcb-baec-4bb2-9d3f-bb6f19cbe73d req-2d992ac1-c815-4f7b-91dc-d424e0c448af 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c4cf9774-0343-498d-9bca-196666c53830] Received event network-vif-unplugged-2471f7dc-ce20-49f1-92be-a9f8b557ae8b for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 21 18:15:00 compute-0 podman[192560]: @ - - [21/Jan/2026:18:15:00 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16587 "" "Go-http-client/1.1"
Jan 21 18:15:00 compute-0 podman[192560]: @ - - [21/Jan/2026:18:15:00 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2639 "" "Go-http-client/1.1"
Jan 21 18:15:00 compute-0 nova_compute[183278]: 2026-01-21 18:15:00.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:15:00 compute-0 nova_compute[183278]: 2026-01-21 18:15:00.817 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 21 18:15:01 compute-0 openstack_network_exporter[195402]: ERROR   18:15:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 21 18:15:01 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:15:01 compute-0 openstack_network_exporter[195402]: ERROR   18:15:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 21 18:15:01 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:15:01 compute-0 nova_compute[183278]: 2026-01-21 18:15:01.499 183284 DEBUG nova.network.neutron [-] [instance: c4cf9774-0343-498d-9bca-196666c53830] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:15:01 compute-0 nova_compute[183278]: 2026-01-21 18:15:01.523 183284 INFO nova.compute.manager [-] [instance: c4cf9774-0343-498d-9bca-196666c53830] Took 2.60 seconds to deallocate network for instance.
Jan 21 18:15:01 compute-0 nova_compute[183278]: 2026-01-21 18:15:01.560 183284 DEBUG nova.compute.manager [req-bcaef682-07e8-40f9-8890-1f33fda3f6fb req-60a44f7d-5408-4e3d-8746-aef7a60b3e9c 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c4cf9774-0343-498d-9bca-196666c53830] Received event network-vif-deleted-2471f7dc-ce20-49f1-92be-a9f8b557ae8b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:15:01 compute-0 nova_compute[183278]: 2026-01-21 18:15:01.573 183284 DEBUG oslo_concurrency.lockutils [None req-39d0b9e5-801a-4264-8265-6065237f06dc 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:15:01 compute-0 nova_compute[183278]: 2026-01-21 18:15:01.574 183284 DEBUG oslo_concurrency.lockutils [None req-39d0b9e5-801a-4264-8265-6065237f06dc 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:15:01 compute-0 nova_compute[183278]: 2026-01-21 18:15:01.639 183284 DEBUG nova.compute.provider_tree [None req-39d0b9e5-801a-4264-8265-6065237f06dc 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Inventory has not changed in ProviderTree for provider: 502e4243-611b-433d-a766-9b485d51652d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 18:15:01 compute-0 nova_compute[183278]: 2026-01-21 18:15:01.652 183284 DEBUG nova.scheduler.client.report [None req-39d0b9e5-801a-4264-8265-6065237f06dc 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Inventory has not changed for provider 502e4243-611b-433d-a766-9b485d51652d based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 18:15:01 compute-0 nova_compute[183278]: 2026-01-21 18:15:01.671 183284 DEBUG oslo_concurrency.lockutils [None req-39d0b9e5-801a-4264-8265-6065237f06dc 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.098s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:15:01 compute-0 nova_compute[183278]: 2026-01-21 18:15:01.694 183284 INFO nova.scheduler.client.report [None req-39d0b9e5-801a-4264-8265-6065237f06dc 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Deleted allocations for instance c4cf9774-0343-498d-9bca-196666c53830
Jan 21 18:15:01 compute-0 nova_compute[183278]: 2026-01-21 18:15:01.696 183284 DEBUG nova.compute.manager [req-93486c1c-1d0b-44bb-a77b-1e20727a47c2 req-6aee19e0-899a-4392-88a4-1a02d015d04d 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c4cf9774-0343-498d-9bca-196666c53830] Received event network-vif-plugged-2471f7dc-ce20-49f1-92be-a9f8b557ae8b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:15:01 compute-0 nova_compute[183278]: 2026-01-21 18:15:01.697 183284 DEBUG oslo_concurrency.lockutils [req-93486c1c-1d0b-44bb-a77b-1e20727a47c2 req-6aee19e0-899a-4392-88a4-1a02d015d04d 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "c4cf9774-0343-498d-9bca-196666c53830-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:15:01 compute-0 nova_compute[183278]: 2026-01-21 18:15:01.698 183284 DEBUG oslo_concurrency.lockutils [req-93486c1c-1d0b-44bb-a77b-1e20727a47c2 req-6aee19e0-899a-4392-88a4-1a02d015d04d 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "c4cf9774-0343-498d-9bca-196666c53830-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:15:01 compute-0 nova_compute[183278]: 2026-01-21 18:15:01.698 183284 DEBUG oslo_concurrency.lockutils [req-93486c1c-1d0b-44bb-a77b-1e20727a47c2 req-6aee19e0-899a-4392-88a4-1a02d015d04d 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "c4cf9774-0343-498d-9bca-196666c53830-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:15:01 compute-0 nova_compute[183278]: 2026-01-21 18:15:01.698 183284 DEBUG nova.compute.manager [req-93486c1c-1d0b-44bb-a77b-1e20727a47c2 req-6aee19e0-899a-4392-88a4-1a02d015d04d 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c4cf9774-0343-498d-9bca-196666c53830] No waiting events found dispatching network-vif-plugged-2471f7dc-ce20-49f1-92be-a9f8b557ae8b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:15:01 compute-0 nova_compute[183278]: 2026-01-21 18:15:01.698 183284 WARNING nova.compute.manager [req-93486c1c-1d0b-44bb-a77b-1e20727a47c2 req-6aee19e0-899a-4392-88a4-1a02d015d04d 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c4cf9774-0343-498d-9bca-196666c53830] Received unexpected event network-vif-plugged-2471f7dc-ce20-49f1-92be-a9f8b557ae8b for instance with vm_state deleted and task_state None.
Jan 21 18:15:01 compute-0 nova_compute[183278]: 2026-01-21 18:15:01.763 183284 DEBUG oslo_concurrency.lockutils [None req-39d0b9e5-801a-4264-8265-6065237f06dc 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Lock "c4cf9774-0343-498d-9bca-196666c53830" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.251s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:15:03 compute-0 nova_compute[183278]: 2026-01-21 18:15:03.127 183284 DEBUG oslo_concurrency.lockutils [None req-9a843f3f-e330-40e5-9d43-f70043ae4463 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Acquiring lock "b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:15:03 compute-0 nova_compute[183278]: 2026-01-21 18:15:03.127 183284 DEBUG oslo_concurrency.lockutils [None req-9a843f3f-e330-40e5-9d43-f70043ae4463 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Lock "b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:15:03 compute-0 nova_compute[183278]: 2026-01-21 18:15:03.128 183284 DEBUG oslo_concurrency.lockutils [None req-9a843f3f-e330-40e5-9d43-f70043ae4463 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Acquiring lock "b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:15:03 compute-0 nova_compute[183278]: 2026-01-21 18:15:03.128 183284 DEBUG oslo_concurrency.lockutils [None req-9a843f3f-e330-40e5-9d43-f70043ae4463 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Lock "b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:15:03 compute-0 nova_compute[183278]: 2026-01-21 18:15:03.128 183284 DEBUG oslo_concurrency.lockutils [None req-9a843f3f-e330-40e5-9d43-f70043ae4463 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Lock "b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:15:03 compute-0 nova_compute[183278]: 2026-01-21 18:15:03.129 183284 INFO nova.compute.manager [None req-9a843f3f-e330-40e5-9d43-f70043ae4463 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9] Terminating instance
Jan 21 18:15:03 compute-0 nova_compute[183278]: 2026-01-21 18:15:03.130 183284 DEBUG nova.compute.manager [None req-9a843f3f-e330-40e5-9d43-f70043ae4463 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 21 18:15:03 compute-0 kernel: tap9cf7b098-bd (unregistering): left promiscuous mode
Jan 21 18:15:03 compute-0 NetworkManager[55506]: <info>  [1769019303.1580] device (tap9cf7b098-bd): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 18:15:03 compute-0 ovn_controller[95419]: 2026-01-21T18:15:03Z|00055|binding|INFO|Releasing lport 9cf7b098-bd13-4dfa-8995-578571b027a3 from this chassis (sb_readonly=0)
Jan 21 18:15:03 compute-0 ovn_controller[95419]: 2026-01-21T18:15:03Z|00056|binding|INFO|Setting lport 9cf7b098-bd13-4dfa-8995-578571b027a3 down in Southbound
Jan 21 18:15:03 compute-0 ovn_controller[95419]: 2026-01-21T18:15:03Z|00057|binding|INFO|Removing iface tap9cf7b098-bd ovn-installed in OVS
Jan 21 18:15:03 compute-0 nova_compute[183278]: 2026-01-21 18:15:03.165 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:15:03 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:15:03.172 104698 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bf:14:07 10.100.0.6'], port_security=['fa:16:3e:bf:14:07 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e8161199-7513-4099-89c4-00e7e075c92b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a4b7cdf556d4f8393d1c61b57628813', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c5ea7560-106a-40fd-a00a-355d8be6545e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=afee9644-a390-49fb-b346-3fd1c948feef, chassis=[], tunnel_key=7, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>], logical_port=9cf7b098-bd13-4dfa-8995-578571b027a3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 18:15:03 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:15:03.173 104698 INFO neutron.agent.ovn.metadata.agent [-] Port 9cf7b098-bd13-4dfa-8995-578571b027a3 in datapath e8161199-7513-4099-89c4-00e7e075c92b unbound from our chassis
Jan 21 18:15:03 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:15:03.175 104698 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e8161199-7513-4099-89c4-00e7e075c92b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 21 18:15:03 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:15:03.176 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[5b44fb66-2cd1-44ee-8413-5b4a4d0823ba]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:15:03 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:15:03.176 104698 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-e8161199-7513-4099-89c4-00e7e075c92b namespace which is not needed anymore
Jan 21 18:15:03 compute-0 nova_compute[183278]: 2026-01-21 18:15:03.184 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:15:03 compute-0 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000005.scope: Deactivated successfully.
Jan 21 18:15:03 compute-0 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000005.scope: Consumed 15.817s CPU time.
Jan 21 18:15:03 compute-0 systemd-machined[154592]: Machine qemu-3-instance-00000005 terminated.
Jan 21 18:15:03 compute-0 neutron-haproxy-ovnmeta-e8161199-7513-4099-89c4-00e7e075c92b[204420]: [NOTICE]   (204424) : haproxy version is 2.8.14-c23fe91
Jan 21 18:15:03 compute-0 neutron-haproxy-ovnmeta-e8161199-7513-4099-89c4-00e7e075c92b[204420]: [NOTICE]   (204424) : path to executable is /usr/sbin/haproxy
Jan 21 18:15:03 compute-0 neutron-haproxy-ovnmeta-e8161199-7513-4099-89c4-00e7e075c92b[204420]: [WARNING]  (204424) : Exiting Master process...
Jan 21 18:15:03 compute-0 neutron-haproxy-ovnmeta-e8161199-7513-4099-89c4-00e7e075c92b[204420]: [WARNING]  (204424) : Exiting Master process...
Jan 21 18:15:03 compute-0 neutron-haproxy-ovnmeta-e8161199-7513-4099-89c4-00e7e075c92b[204420]: [ALERT]    (204424) : Current worker (204426) exited with code 143 (Terminated)
Jan 21 18:15:03 compute-0 neutron-haproxy-ovnmeta-e8161199-7513-4099-89c4-00e7e075c92b[204420]: [WARNING]  (204424) : All workers exited. Exiting... (0)
Jan 21 18:15:03 compute-0 systemd[1]: libpod-9c615a6ac9004548d3719addf22293ef1086c1786f379fc3cdaf795323eaa730.scope: Deactivated successfully.
Jan 21 18:15:03 compute-0 podman[204986]: 2026-01-21 18:15:03.304669244 +0000 UTC m=+0.042834389 container died 9c615a6ac9004548d3719addf22293ef1086c1786f379fc3cdaf795323eaa730 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e8161199-7513-4099-89c4-00e7e075c92b, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Jan 21 18:15:03 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9c615a6ac9004548d3719addf22293ef1086c1786f379fc3cdaf795323eaa730-userdata-shm.mount: Deactivated successfully.
Jan 21 18:15:03 compute-0 systemd[1]: var-lib-containers-storage-overlay-45ecf4c57b028f6f42299024e7748d826f023a4d0b4516648c3499cb0c3c19c6-merged.mount: Deactivated successfully.
Jan 21 18:15:03 compute-0 podman[204986]: 2026-01-21 18:15:03.340106364 +0000 UTC m=+0.078271469 container cleanup 9c615a6ac9004548d3719addf22293ef1086c1786f379fc3cdaf795323eaa730 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e8161199-7513-4099-89c4-00e7e075c92b, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 21 18:15:03 compute-0 systemd[1]: libpod-conmon-9c615a6ac9004548d3719addf22293ef1086c1786f379fc3cdaf795323eaa730.scope: Deactivated successfully.
Jan 21 18:15:03 compute-0 nova_compute[183278]: 2026-01-21 18:15:03.384 183284 INFO nova.virt.libvirt.driver [-] [instance: b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9] Instance destroyed successfully.
Jan 21 18:15:03 compute-0 nova_compute[183278]: 2026-01-21 18:15:03.385 183284 DEBUG nova.objects.instance [None req-9a843f3f-e330-40e5-9d43-f70043ae4463 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Lazy-loading 'resources' on Instance uuid b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:15:03 compute-0 nova_compute[183278]: 2026-01-21 18:15:03.400 183284 DEBUG nova.virt.libvirt.vif [None req-9a843f3f-e330-40e5-9d43-f70043ae4463 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T18:14:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1134094695',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1134094695',id=5,image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T18:14:09Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2a4b7cdf556d4f8393d1c61b57628813',ramdisk_id='',reservation_id='r-zgv0rt21',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',ow
ner_project_name='tempest-TestExecuteActionsViaActuator-627352265',owner_user_name='tempest-TestExecuteActionsViaActuator-627352265-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T18:14:09Z,user_data=None,user_id='16f8ab2ae83b48f9a88753a5deddcc19',uuid=b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9cf7b098-bd13-4dfa-8995-578571b027a3", "address": "fa:16:3e:bf:14:07", "network": {"id": "e8161199-7513-4099-89c4-00e7e075c92b", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-514974975-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2a4b7cdf556d4f8393d1c61b57628813", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9cf7b098-bd", "ovs_interfaceid": "9cf7b098-bd13-4dfa-8995-578571b027a3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 18:15:03 compute-0 nova_compute[183278]: 2026-01-21 18:15:03.401 183284 DEBUG nova.network.os_vif_util [None req-9a843f3f-e330-40e5-9d43-f70043ae4463 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Converting VIF {"id": "9cf7b098-bd13-4dfa-8995-578571b027a3", "address": "fa:16:3e:bf:14:07", "network": {"id": "e8161199-7513-4099-89c4-00e7e075c92b", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-514974975-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2a4b7cdf556d4f8393d1c61b57628813", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9cf7b098-bd", "ovs_interfaceid": "9cf7b098-bd13-4dfa-8995-578571b027a3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 18:15:03 compute-0 nova_compute[183278]: 2026-01-21 18:15:03.402 183284 DEBUG nova.network.os_vif_util [None req-9a843f3f-e330-40e5-9d43-f70043ae4463 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bf:14:07,bridge_name='br-int',has_traffic_filtering=True,id=9cf7b098-bd13-4dfa-8995-578571b027a3,network=Network(e8161199-7513-4099-89c4-00e7e075c92b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9cf7b098-bd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 18:15:03 compute-0 nova_compute[183278]: 2026-01-21 18:15:03.402 183284 DEBUG os_vif [None req-9a843f3f-e330-40e5-9d43-f70043ae4463 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:bf:14:07,bridge_name='br-int',has_traffic_filtering=True,id=9cf7b098-bd13-4dfa-8995-578571b027a3,network=Network(e8161199-7513-4099-89c4-00e7e075c92b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9cf7b098-bd') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 21 18:15:03 compute-0 podman[205020]: 2026-01-21 18:15:03.402572689 +0000 UTC m=+0.041965529 container remove 9c615a6ac9004548d3719addf22293ef1086c1786f379fc3cdaf795323eaa730 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e8161199-7513-4099-89c4-00e7e075c92b, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Jan 21 18:15:03 compute-0 nova_compute[183278]: 2026-01-21 18:15:03.404 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:15:03 compute-0 nova_compute[183278]: 2026-01-21 18:15:03.405 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9cf7b098-bd, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:15:03 compute-0 nova_compute[183278]: 2026-01-21 18:15:03.406 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:15:03 compute-0 nova_compute[183278]: 2026-01-21 18:15:03.408 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:15:03 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:15:03.407 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[2fa12c62-91a9-4c3d-a967-4fbd81980e45]: (4, ('Wed Jan 21 06:15:03 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-e8161199-7513-4099-89c4-00e7e075c92b (9c615a6ac9004548d3719addf22293ef1086c1786f379fc3cdaf795323eaa730)\n9c615a6ac9004548d3719addf22293ef1086c1786f379fc3cdaf795323eaa730\nWed Jan 21 06:15:03 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-e8161199-7513-4099-89c4-00e7e075c92b (9c615a6ac9004548d3719addf22293ef1086c1786f379fc3cdaf795323eaa730)\n9c615a6ac9004548d3719addf22293ef1086c1786f379fc3cdaf795323eaa730\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:15:03 compute-0 nova_compute[183278]: 2026-01-21 18:15:03.410 183284 INFO os_vif [None req-9a843f3f-e330-40e5-9d43-f70043ae4463 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:bf:14:07,bridge_name='br-int',has_traffic_filtering=True,id=9cf7b098-bd13-4dfa-8995-578571b027a3,network=Network(e8161199-7513-4099-89c4-00e7e075c92b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9cf7b098-bd')
Jan 21 18:15:03 compute-0 nova_compute[183278]: 2026-01-21 18:15:03.411 183284 INFO nova.virt.libvirt.driver [None req-9a843f3f-e330-40e5-9d43-f70043ae4463 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9] Deleting instance files /var/lib/nova/instances/b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9_del
Jan 21 18:15:03 compute-0 nova_compute[183278]: 2026-01-21 18:15:03.411 183284 INFO nova.virt.libvirt.driver [None req-9a843f3f-e330-40e5-9d43-f70043ae4463 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9] Deletion of /var/lib/nova/instances/b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9_del complete
Jan 21 18:15:03 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:15:03.411 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[79616432-40dd-4910-adc9-e4f9e2ce563c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:15:03 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:15:03.411 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape8161199-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:15:03 compute-0 kernel: tape8161199-70: left promiscuous mode
Jan 21 18:15:03 compute-0 nova_compute[183278]: 2026-01-21 18:15:03.415 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:15:03 compute-0 nova_compute[183278]: 2026-01-21 18:15:03.425 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:15:03 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:15:03.427 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[d38dfbbc-1386-4388-afcf-1126fe640af9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:15:03 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:15:03.446 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[6dfd89b7-eb8d-4b98-b155-570f3bd1a991]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:15:03 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:15:03.447 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[3f0a1850-ae2f-4e7a-8c77-ccb1f18c4bbb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:15:03 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:15:03.462 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[184f2a16-e386-47f0-a7e1-9e6f72ee71e2]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 378315, 'reachable_time': 25327, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 205047, 'error': None, 'target': 'ovnmeta-e8161199-7513-4099-89c4-00e7e075c92b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:15:03 compute-0 nova_compute[183278]: 2026-01-21 18:15:03.463 183284 INFO nova.compute.manager [None req-9a843f3f-e330-40e5-9d43-f70043ae4463 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] [instance: b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9] Took 0.33 seconds to destroy the instance on the hypervisor.
Jan 21 18:15:03 compute-0 nova_compute[183278]: 2026-01-21 18:15:03.463 183284 DEBUG oslo.service.loopingcall [None req-9a843f3f-e330-40e5-9d43-f70043ae4463 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 21 18:15:03 compute-0 nova_compute[183278]: 2026-01-21 18:15:03.464 183284 DEBUG nova.compute.manager [-] [instance: b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 21 18:15:03 compute-0 nova_compute[183278]: 2026-01-21 18:15:03.464 183284 DEBUG nova.network.neutron [-] [instance: b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 21 18:15:03 compute-0 systemd[1]: run-netns-ovnmeta\x2de8161199\x2d7513\x2d4099\x2d89c4\x2d00e7e075c92b.mount: Deactivated successfully.
Jan 21 18:15:03 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:15:03.465 105036 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-e8161199-7513-4099-89c4-00e7e075c92b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 21 18:15:03 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:15:03.466 105036 DEBUG oslo.privsep.daemon [-] privsep: reply[67e5b310-4969-464d-b8b9-3ecdc9f6a0c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:15:03 compute-0 nova_compute[183278]: 2026-01-21 18:15:03.665 183284 DEBUG nova.compute.manager [req-f2501f49-b610-4a7e-8a51-9d9683ee1e43 req-54645559-d600-4ef6-976e-ef83c20fabee 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9] Received event network-vif-unplugged-9cf7b098-bd13-4dfa-8995-578571b027a3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:15:03 compute-0 nova_compute[183278]: 2026-01-21 18:15:03.666 183284 DEBUG oslo_concurrency.lockutils [req-f2501f49-b610-4a7e-8a51-9d9683ee1e43 req-54645559-d600-4ef6-976e-ef83c20fabee 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:15:03 compute-0 nova_compute[183278]: 2026-01-21 18:15:03.667 183284 DEBUG oslo_concurrency.lockutils [req-f2501f49-b610-4a7e-8a51-9d9683ee1e43 req-54645559-d600-4ef6-976e-ef83c20fabee 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:15:03 compute-0 nova_compute[183278]: 2026-01-21 18:15:03.667 183284 DEBUG oslo_concurrency.lockutils [req-f2501f49-b610-4a7e-8a51-9d9683ee1e43 req-54645559-d600-4ef6-976e-ef83c20fabee 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:15:03 compute-0 nova_compute[183278]: 2026-01-21 18:15:03.668 183284 DEBUG nova.compute.manager [req-f2501f49-b610-4a7e-8a51-9d9683ee1e43 req-54645559-d600-4ef6-976e-ef83c20fabee 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9] No waiting events found dispatching network-vif-unplugged-9cf7b098-bd13-4dfa-8995-578571b027a3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:15:03 compute-0 nova_compute[183278]: 2026-01-21 18:15:03.668 183284 DEBUG nova.compute.manager [req-f2501f49-b610-4a7e-8a51-9d9683ee1e43 req-54645559-d600-4ef6-976e-ef83c20fabee 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9] Received event network-vif-unplugged-9cf7b098-bd13-4dfa-8995-578571b027a3 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 21 18:15:04 compute-0 nova_compute[183278]: 2026-01-21 18:15:04.053 183284 DEBUG nova.network.neutron [-] [instance: b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:15:04 compute-0 nova_compute[183278]: 2026-01-21 18:15:04.072 183284 INFO nova.compute.manager [-] [instance: b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9] Took 0.61 seconds to deallocate network for instance.
Jan 21 18:15:04 compute-0 nova_compute[183278]: 2026-01-21 18:15:04.109 183284 DEBUG oslo_concurrency.lockutils [None req-9a843f3f-e330-40e5-9d43-f70043ae4463 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:15:04 compute-0 nova_compute[183278]: 2026-01-21 18:15:04.109 183284 DEBUG oslo_concurrency.lockutils [None req-9a843f3f-e330-40e5-9d43-f70043ae4463 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:15:04 compute-0 nova_compute[183278]: 2026-01-21 18:15:04.155 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:15:04 compute-0 nova_compute[183278]: 2026-01-21 18:15:04.274 183284 DEBUG nova.compute.provider_tree [None req-9a843f3f-e330-40e5-9d43-f70043ae4463 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Inventory has not changed in ProviderTree for provider: 502e4243-611b-433d-a766-9b485d51652d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 18:15:04 compute-0 nova_compute[183278]: 2026-01-21 18:15:04.287 183284 DEBUG nova.scheduler.client.report [None req-9a843f3f-e330-40e5-9d43-f70043ae4463 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Inventory has not changed for provider 502e4243-611b-433d-a766-9b485d51652d based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 18:15:04 compute-0 nova_compute[183278]: 2026-01-21 18:15:04.307 183284 DEBUG oslo_concurrency.lockutils [None req-9a843f3f-e330-40e5-9d43-f70043ae4463 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.198s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:15:04 compute-0 nova_compute[183278]: 2026-01-21 18:15:04.372 183284 INFO nova.scheduler.client.report [None req-9a843f3f-e330-40e5-9d43-f70043ae4463 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Deleted allocations for instance b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9
Jan 21 18:15:04 compute-0 nova_compute[183278]: 2026-01-21 18:15:04.426 183284 DEBUG oslo_concurrency.lockutils [None req-9a843f3f-e330-40e5-9d43-f70043ae4463 16f8ab2ae83b48f9a88753a5deddcc19 2a4b7cdf556d4f8393d1c61b57628813 - - default default] Lock "b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.299s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:15:04 compute-0 nova_compute[183278]: 2026-01-21 18:15:04.440 183284 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769019289.4385655, b499883a-ee9f-4239-b996-4fbaa175bcc3 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:15:04 compute-0 nova_compute[183278]: 2026-01-21 18:15:04.441 183284 INFO nova.compute.manager [-] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] VM Stopped (Lifecycle Event)
Jan 21 18:15:04 compute-0 nova_compute[183278]: 2026-01-21 18:15:04.455 183284 DEBUG nova.compute.manager [None req-87c94ff8-c95a-4647-9036-c535db5a3adb - - - - - -] [instance: b499883a-ee9f-4239-b996-4fbaa175bcc3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:15:05 compute-0 podman[205048]: 2026-01-21 18:15:05.004544389 +0000 UTC m=+0.063691215 container health_status 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., io.openshift.expose-services=, version=9.6, vendor=Red Hat, Inc., distribution-scope=public, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, config_id=openstack_network_exporter, container_name=openstack_network_exporter, vcs-type=git, io.buildah.version=1.33.7)
Jan 21 18:15:05 compute-0 nova_compute[183278]: 2026-01-21 18:15:05.789 183284 DEBUG nova.compute.manager [req-47376097-dd65-406e-ba56-09b5d8e3a125 req-18ca9cf9-ec00-4c0a-831e-2ddead296018 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9] Received event network-vif-plugged-9cf7b098-bd13-4dfa-8995-578571b027a3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:15:05 compute-0 nova_compute[183278]: 2026-01-21 18:15:05.790 183284 DEBUG oslo_concurrency.lockutils [req-47376097-dd65-406e-ba56-09b5d8e3a125 req-18ca9cf9-ec00-4c0a-831e-2ddead296018 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:15:05 compute-0 nova_compute[183278]: 2026-01-21 18:15:05.790 183284 DEBUG oslo_concurrency.lockutils [req-47376097-dd65-406e-ba56-09b5d8e3a125 req-18ca9cf9-ec00-4c0a-831e-2ddead296018 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:15:05 compute-0 nova_compute[183278]: 2026-01-21 18:15:05.790 183284 DEBUG oslo_concurrency.lockutils [req-47376097-dd65-406e-ba56-09b5d8e3a125 req-18ca9cf9-ec00-4c0a-831e-2ddead296018 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:15:05 compute-0 nova_compute[183278]: 2026-01-21 18:15:05.790 183284 DEBUG nova.compute.manager [req-47376097-dd65-406e-ba56-09b5d8e3a125 req-18ca9cf9-ec00-4c0a-831e-2ddead296018 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9] No waiting events found dispatching network-vif-plugged-9cf7b098-bd13-4dfa-8995-578571b027a3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:15:05 compute-0 nova_compute[183278]: 2026-01-21 18:15:05.790 183284 WARNING nova.compute.manager [req-47376097-dd65-406e-ba56-09b5d8e3a125 req-18ca9cf9-ec00-4c0a-831e-2ddead296018 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9] Received unexpected event network-vif-plugged-9cf7b098-bd13-4dfa-8995-578571b027a3 for instance with vm_state deleted and task_state None.
Jan 21 18:15:05 compute-0 nova_compute[183278]: 2026-01-21 18:15:05.791 183284 DEBUG nova.compute.manager [req-47376097-dd65-406e-ba56-09b5d8e3a125 req-18ca9cf9-ec00-4c0a-831e-2ddead296018 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9] Received event network-vif-deleted-9cf7b098-bd13-4dfa-8995-578571b027a3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:15:05 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:15:05.885 104698 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e2:aa:66', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f6:39:87:73:c8'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 18:15:05 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:15:05.886 104698 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 21 18:15:05 compute-0 nova_compute[183278]: 2026-01-21 18:15:05.905 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:15:05 compute-0 nova_compute[183278]: 2026-01-21 18:15:05.921 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:15:05 compute-0 nova_compute[183278]: 2026-01-21 18:15:05.922 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 18:15:05 compute-0 nova_compute[183278]: 2026-01-21 18:15:05.922 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 18:15:05 compute-0 nova_compute[183278]: 2026-01-21 18:15:05.936 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 21 18:15:05 compute-0 nova_compute[183278]: 2026-01-21 18:15:05.937 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:15:06 compute-0 nova_compute[183278]: 2026-01-21 18:15:06.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:15:07 compute-0 nova_compute[183278]: 2026-01-21 18:15:07.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:15:08 compute-0 nova_compute[183278]: 2026-01-21 18:15:08.013 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:15:08 compute-0 nova_compute[183278]: 2026-01-21 18:15:08.014 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:15:08 compute-0 nova_compute[183278]: 2026-01-21 18:15:08.014 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:15:08 compute-0 nova_compute[183278]: 2026-01-21 18:15:08.014 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 18:15:08 compute-0 nova_compute[183278]: 2026-01-21 18:15:08.148 183284 WARNING nova.virt.libvirt.driver [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 18:15:08 compute-0 nova_compute[183278]: 2026-01-21 18:15:08.150 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5868MB free_disk=73.38360595703125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 18:15:08 compute-0 nova_compute[183278]: 2026-01-21 18:15:08.150 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:15:08 compute-0 nova_compute[183278]: 2026-01-21 18:15:08.151 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:15:08 compute-0 nova_compute[183278]: 2026-01-21 18:15:08.215 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 18:15:08 compute-0 nova_compute[183278]: 2026-01-21 18:15:08.215 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 18:15:08 compute-0 nova_compute[183278]: 2026-01-21 18:15:08.234 183284 DEBUG nova.compute.provider_tree [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Inventory has not changed in ProviderTree for provider: 502e4243-611b-433d-a766-9b485d51652d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 18:15:08 compute-0 nova_compute[183278]: 2026-01-21 18:15:08.249 183284 DEBUG nova.scheduler.client.report [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Inventory has not changed for provider 502e4243-611b-433d-a766-9b485d51652d based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 18:15:08 compute-0 nova_compute[183278]: 2026-01-21 18:15:08.275 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 18:15:08 compute-0 nova_compute[183278]: 2026-01-21 18:15:08.276 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.125s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:15:08 compute-0 nova_compute[183278]: 2026-01-21 18:15:08.438 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:15:09 compute-0 nova_compute[183278]: 2026-01-21 18:15:09.156 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:15:09 compute-0 nova_compute[183278]: 2026-01-21 18:15:09.276 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:15:09 compute-0 nova_compute[183278]: 2026-01-21 18:15:09.811 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:15:09 compute-0 nova_compute[183278]: 2026-01-21 18:15:09.811 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:15:09 compute-0 nova_compute[183278]: 2026-01-21 18:15:09.830 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:15:09 compute-0 nova_compute[183278]: 2026-01-21 18:15:09.830 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 18:15:10 compute-0 nova_compute[183278]: 2026-01-21 18:15:10.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:15:10 compute-0 nova_compute[183278]: 2026-01-21 18:15:10.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:15:12 compute-0 podman[205071]: 2026-01-21 18:15:12.006626732 +0000 UTC m=+0.057020913 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 21 18:15:12 compute-0 nova_compute[183278]: 2026-01-21 18:15:12.021 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:15:12 compute-0 podman[205070]: 2026-01-21 18:15:12.032618503 +0000 UTC m=+0.087420182 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 21 18:15:13 compute-0 nova_compute[183278]: 2026-01-21 18:15:13.439 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:15:13 compute-0 nova_compute[183278]: 2026-01-21 18:15:13.780 183284 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769019298.7791178, c4cf9774-0343-498d-9bca-196666c53830 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:15:13 compute-0 nova_compute[183278]: 2026-01-21 18:15:13.780 183284 INFO nova.compute.manager [-] [instance: c4cf9774-0343-498d-9bca-196666c53830] VM Stopped (Lifecycle Event)
Jan 21 18:15:13 compute-0 nova_compute[183278]: 2026-01-21 18:15:13.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:15:13 compute-0 nova_compute[183278]: 2026-01-21 18:15:13.817 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 21 18:15:14 compute-0 nova_compute[183278]: 2026-01-21 18:15:14.158 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:15:14 compute-0 nova_compute[183278]: 2026-01-21 18:15:14.996 183284 DEBUG nova.compute.manager [None req-f8056af0-aa8c-40e1-abdd-fcd7e4d9d64a - - - - - -] [instance: c4cf9774-0343-498d-9bca-196666c53830] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:15:15 compute-0 nova_compute[183278]: 2026-01-21 18:15:15.038 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 21 18:15:15 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:15:15.888 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a4db021d-a451-4e5f-8011-49af760bda68, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:15:15 compute-0 podman[205111]: 2026-01-21 18:15:15.99343028 +0000 UTC m=+0.050442424 container health_status 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 21 18:15:18 compute-0 nova_compute[183278]: 2026-01-21 18:15:18.383 183284 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769019303.3826637, b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:15:18 compute-0 nova_compute[183278]: 2026-01-21 18:15:18.384 183284 INFO nova.compute.manager [-] [instance: b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9] VM Stopped (Lifecycle Event)
Jan 21 18:15:18 compute-0 nova_compute[183278]: 2026-01-21 18:15:18.491 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:15:18 compute-0 nova_compute[183278]: 2026-01-21 18:15:18.816 183284 DEBUG nova.compute.manager [None req-83759556-e7e9-4c08-afa2-0b2c0ecfe1fa - - - - - -] [instance: b4d5a5d0-3b44-4630-9b1d-dd50123c2dc9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:15:19 compute-0 nova_compute[183278]: 2026-01-21 18:15:19.160 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:15:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:15:20.071 104698 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:15:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:15:20.072 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:15:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:15:20.072 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:15:23 compute-0 nova_compute[183278]: 2026-01-21 18:15:23.494 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:15:24 compute-0 nova_compute[183278]: 2026-01-21 18:15:24.161 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:15:28 compute-0 nova_compute[183278]: 2026-01-21 18:15:28.497 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:15:29 compute-0 nova_compute[183278]: 2026-01-21 18:15:29.163 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:15:29 compute-0 podman[192560]: time="2026-01-21T18:15:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 18:15:29 compute-0 podman[192560]: @ - - [21/Jan/2026:18:15:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15354 "" "Go-http-client/1.1"
Jan 21 18:15:29 compute-0 podman[192560]: @ - - [21/Jan/2026:18:15:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2171 "" "Go-http-client/1.1"
Jan 21 18:15:29 compute-0 nova_compute[183278]: 2026-01-21 18:15:29.929 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:15:31 compute-0 openstack_network_exporter[195402]: ERROR   18:15:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 21 18:15:31 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:15:31 compute-0 openstack_network_exporter[195402]: ERROR   18:15:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 21 18:15:31 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:15:32 compute-0 nova_compute[183278]: 2026-01-21 18:15:32.041 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:15:33 compute-0 nova_compute[183278]: 2026-01-21 18:15:33.500 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:15:34 compute-0 nova_compute[183278]: 2026-01-21 18:15:34.164 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:15:35 compute-0 podman[205135]: 2026-01-21 18:15:35.998755068 +0000 UTC m=+0.057683050 container health_status 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.buildah.version=1.33.7, architecture=x86_64, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, vcs-type=git, version=9.6, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public)
Jan 21 18:15:38 compute-0 nova_compute[183278]: 2026-01-21 18:15:38.504 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:15:39 compute-0 nova_compute[183278]: 2026-01-21 18:15:39.165 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:15:42 compute-0 podman[205157]: 2026-01-21 18:15:42.995281118 +0000 UTC m=+0.045924415 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 21 18:15:43 compute-0 podman[205156]: 2026-01-21 18:15:43.048338975 +0000 UTC m=+0.104456615 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Jan 21 18:15:43 compute-0 nova_compute[183278]: 2026-01-21 18:15:43.505 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:15:44 compute-0 nova_compute[183278]: 2026-01-21 18:15:44.167 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:15:46 compute-0 podman[205199]: 2026-01-21 18:15:46.993363518 +0000 UTC m=+0.053959208 container health_status 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 18:15:48 compute-0 nova_compute[183278]: 2026-01-21 18:15:48.509 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:15:49 compute-0 nova_compute[183278]: 2026-01-21 18:15:49.168 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:15:53 compute-0 nova_compute[183278]: 2026-01-21 18:15:53.513 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:15:54 compute-0 nova_compute[183278]: 2026-01-21 18:15:54.169 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:15:54 compute-0 sshd-session[205224]: Invalid user ansible_user from 64.227.98.100 port 55978
Jan 21 18:15:54 compute-0 sshd-session[205224]: Connection closed by invalid user ansible_user 64.227.98.100 port 55978 [preauth]
Jan 21 18:15:58 compute-0 nova_compute[183278]: 2026-01-21 18:15:58.516 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:15:59 compute-0 nova_compute[183278]: 2026-01-21 18:15:59.170 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:15:59 compute-0 podman[192560]: time="2026-01-21T18:15:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 18:15:59 compute-0 podman[192560]: @ - - [21/Jan/2026:18:15:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15354 "" "Go-http-client/1.1"
Jan 21 18:15:59 compute-0 podman[192560]: @ - - [21/Jan/2026:18:15:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2165 "" "Go-http-client/1.1"
Jan 21 18:16:01 compute-0 openstack_network_exporter[195402]: ERROR   18:16:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 21 18:16:01 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:16:01 compute-0 openstack_network_exporter[195402]: ERROR   18:16:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 21 18:16:01 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:16:03 compute-0 nova_compute[183278]: 2026-01-21 18:16:03.519 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:16:04 compute-0 nova_compute[183278]: 2026-01-21 18:16:04.000 183284 DEBUG oslo_concurrency.lockutils [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] Acquiring lock "b12214ad-ceca-4678-87c7-b9f991d6959e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:16:04 compute-0 nova_compute[183278]: 2026-01-21 18:16:04.001 183284 DEBUG oslo_concurrency.lockutils [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] Lock "b12214ad-ceca-4678-87c7-b9f991d6959e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:16:04 compute-0 nova_compute[183278]: 2026-01-21 18:16:04.017 183284 DEBUG nova.compute.manager [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 21 18:16:04 compute-0 nova_compute[183278]: 2026-01-21 18:16:04.105 183284 DEBUG oslo_concurrency.lockutils [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:16:04 compute-0 nova_compute[183278]: 2026-01-21 18:16:04.106 183284 DEBUG oslo_concurrency.lockutils [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:16:04 compute-0 nova_compute[183278]: 2026-01-21 18:16:04.112 183284 DEBUG nova.virt.hardware [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 18:16:04 compute-0 nova_compute[183278]: 2026-01-21 18:16:04.113 183284 INFO nova.compute.claims [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Claim successful on node compute-0.ctlplane.example.com
Jan 21 18:16:04 compute-0 nova_compute[183278]: 2026-01-21 18:16:04.171 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:16:04 compute-0 nova_compute[183278]: 2026-01-21 18:16:04.235 183284 DEBUG nova.compute.provider_tree [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] Inventory has not changed in ProviderTree for provider: 502e4243-611b-433d-a766-9b485d51652d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 18:16:04 compute-0 nova_compute[183278]: 2026-01-21 18:16:04.249 183284 DEBUG nova.scheduler.client.report [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] Inventory has not changed for provider 502e4243-611b-433d-a766-9b485d51652d based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 18:16:04 compute-0 nova_compute[183278]: 2026-01-21 18:16:04.269 183284 DEBUG oslo_concurrency.lockutils [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.163s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:16:04 compute-0 nova_compute[183278]: 2026-01-21 18:16:04.269 183284 DEBUG nova.compute.manager [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 21 18:16:04 compute-0 nova_compute[183278]: 2026-01-21 18:16:04.320 183284 DEBUG nova.compute.manager [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 21 18:16:04 compute-0 nova_compute[183278]: 2026-01-21 18:16:04.320 183284 DEBUG nova.network.neutron [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 21 18:16:04 compute-0 nova_compute[183278]: 2026-01-21 18:16:04.351 183284 INFO nova.virt.libvirt.driver [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 21 18:16:04 compute-0 nova_compute[183278]: 2026-01-21 18:16:04.368 183284 DEBUG nova.compute.manager [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 21 18:16:04 compute-0 nova_compute[183278]: 2026-01-21 18:16:04.469 183284 DEBUG nova.compute.manager [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 21 18:16:04 compute-0 nova_compute[183278]: 2026-01-21 18:16:04.470 183284 DEBUG nova.virt.libvirt.driver [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 18:16:04 compute-0 nova_compute[183278]: 2026-01-21 18:16:04.471 183284 INFO nova.virt.libvirt.driver [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Creating image(s)
Jan 21 18:16:04 compute-0 nova_compute[183278]: 2026-01-21 18:16:04.471 183284 DEBUG oslo_concurrency.lockutils [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] Acquiring lock "/var/lib/nova/instances/b12214ad-ceca-4678-87c7-b9f991d6959e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:16:04 compute-0 nova_compute[183278]: 2026-01-21 18:16:04.471 183284 DEBUG oslo_concurrency.lockutils [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] Lock "/var/lib/nova/instances/b12214ad-ceca-4678-87c7-b9f991d6959e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:16:04 compute-0 nova_compute[183278]: 2026-01-21 18:16:04.472 183284 DEBUG oslo_concurrency.lockutils [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] Lock "/var/lib/nova/instances/b12214ad-ceca-4678-87c7-b9f991d6959e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:16:04 compute-0 nova_compute[183278]: 2026-01-21 18:16:04.485 183284 DEBUG oslo_concurrency.processutils [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:16:04 compute-0 nova_compute[183278]: 2026-01-21 18:16:04.538 183284 DEBUG nova.policy [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c22533b2094d465a9fc14ed57c562c02', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'aeb82e0980254fc885bc0eaa70c4cc68', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 21 18:16:04 compute-0 nova_compute[183278]: 2026-01-21 18:16:04.542 183284 DEBUG oslo_concurrency.processutils [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:16:04 compute-0 nova_compute[183278]: 2026-01-21 18:16:04.543 183284 DEBUG oslo_concurrency.lockutils [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] Acquiring lock "8bf45782a806e8f2f684ae874be9ab99d891a685" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:16:04 compute-0 nova_compute[183278]: 2026-01-21 18:16:04.543 183284 DEBUG oslo_concurrency.lockutils [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] Lock "8bf45782a806e8f2f684ae874be9ab99d891a685" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:16:04 compute-0 nova_compute[183278]: 2026-01-21 18:16:04.557 183284 DEBUG oslo_concurrency.processutils [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:16:04 compute-0 nova_compute[183278]: 2026-01-21 18:16:04.613 183284 DEBUG oslo_concurrency.processutils [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:16:04 compute-0 nova_compute[183278]: 2026-01-21 18:16:04.614 183284 DEBUG oslo_concurrency.processutils [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685,backing_fmt=raw /var/lib/nova/instances/b12214ad-ceca-4678-87c7-b9f991d6959e/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:16:04 compute-0 nova_compute[183278]: 2026-01-21 18:16:04.681 183284 DEBUG oslo_concurrency.processutils [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685,backing_fmt=raw /var/lib/nova/instances/b12214ad-ceca-4678-87c7-b9f991d6959e/disk 1073741824" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:16:04 compute-0 nova_compute[183278]: 2026-01-21 18:16:04.682 183284 DEBUG oslo_concurrency.lockutils [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] Lock "8bf45782a806e8f2f684ae874be9ab99d891a685" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.139s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:16:04 compute-0 nova_compute[183278]: 2026-01-21 18:16:04.683 183284 DEBUG oslo_concurrency.processutils [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:16:04 compute-0 nova_compute[183278]: 2026-01-21 18:16:04.737 183284 DEBUG oslo_concurrency.processutils [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:16:04 compute-0 nova_compute[183278]: 2026-01-21 18:16:04.738 183284 DEBUG nova.virt.disk.api [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] Checking if we can resize image /var/lib/nova/instances/b12214ad-ceca-4678-87c7-b9f991d6959e/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 18:16:04 compute-0 nova_compute[183278]: 2026-01-21 18:16:04.738 183284 DEBUG oslo_concurrency.processutils [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b12214ad-ceca-4678-87c7-b9f991d6959e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:16:04 compute-0 nova_compute[183278]: 2026-01-21 18:16:04.795 183284 DEBUG oslo_concurrency.processutils [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b12214ad-ceca-4678-87c7-b9f991d6959e/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:16:04 compute-0 nova_compute[183278]: 2026-01-21 18:16:04.796 183284 DEBUG nova.virt.disk.api [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] Cannot resize image /var/lib/nova/instances/b12214ad-ceca-4678-87c7-b9f991d6959e/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 18:16:04 compute-0 nova_compute[183278]: 2026-01-21 18:16:04.797 183284 DEBUG nova.objects.instance [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] Lazy-loading 'migration_context' on Instance uuid b12214ad-ceca-4678-87c7-b9f991d6959e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:16:04 compute-0 nova_compute[183278]: 2026-01-21 18:16:04.820 183284 DEBUG nova.virt.libvirt.driver [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 18:16:04 compute-0 nova_compute[183278]: 2026-01-21 18:16:04.821 183284 DEBUG nova.virt.libvirt.driver [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Ensure instance console log exists: /var/lib/nova/instances/b12214ad-ceca-4678-87c7-b9f991d6959e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 18:16:04 compute-0 nova_compute[183278]: 2026-01-21 18:16:04.821 183284 DEBUG oslo_concurrency.lockutils [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:16:04 compute-0 nova_compute[183278]: 2026-01-21 18:16:04.821 183284 DEBUG oslo_concurrency.lockutils [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:16:04 compute-0 nova_compute[183278]: 2026-01-21 18:16:04.822 183284 DEBUG oslo_concurrency.lockutils [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:16:05 compute-0 nova_compute[183278]: 2026-01-21 18:16:05.515 183284 DEBUG nova.network.neutron [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Successfully created port: d068a9ac-1495-43a0-9d00-8867b0e13f03 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 21 18:16:06 compute-0 nova_compute[183278]: 2026-01-21 18:16:06.116 183284 DEBUG nova.network.neutron [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Successfully updated port: d068a9ac-1495-43a0-9d00-8867b0e13f03 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 21 18:16:06 compute-0 nova_compute[183278]: 2026-01-21 18:16:06.130 183284 DEBUG oslo_concurrency.lockutils [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] Acquiring lock "refresh_cache-b12214ad-ceca-4678-87c7-b9f991d6959e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:16:06 compute-0 nova_compute[183278]: 2026-01-21 18:16:06.130 183284 DEBUG oslo_concurrency.lockutils [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] Acquired lock "refresh_cache-b12214ad-ceca-4678-87c7-b9f991d6959e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 18:16:06 compute-0 nova_compute[183278]: 2026-01-21 18:16:06.130 183284 DEBUG nova.network.neutron [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 18:16:06 compute-0 nova_compute[183278]: 2026-01-21 18:16:06.221 183284 DEBUG nova.compute.manager [req-2e2b09c8-1a18-43af-9838-f9a23e9c4bb8 req-d70cc4ea-6e8e-429a-8fb2-f87514ce33a9 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Received event network-changed-d068a9ac-1495-43a0-9d00-8867b0e13f03 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:16:06 compute-0 nova_compute[183278]: 2026-01-21 18:16:06.222 183284 DEBUG nova.compute.manager [req-2e2b09c8-1a18-43af-9838-f9a23e9c4bb8 req-d70cc4ea-6e8e-429a-8fb2-f87514ce33a9 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Refreshing instance network info cache due to event network-changed-d068a9ac-1495-43a0-9d00-8867b0e13f03. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 18:16:06 compute-0 nova_compute[183278]: 2026-01-21 18:16:06.222 183284 DEBUG oslo_concurrency.lockutils [req-2e2b09c8-1a18-43af-9838-f9a23e9c4bb8 req-d70cc4ea-6e8e-429a-8fb2-f87514ce33a9 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "refresh_cache-b12214ad-ceca-4678-87c7-b9f991d6959e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:16:06 compute-0 nova_compute[183278]: 2026-01-21 18:16:06.271 183284 DEBUG nova.network.neutron [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 18:16:06 compute-0 nova_compute[183278]: 2026-01-21 18:16:06.837 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:16:06 compute-0 nova_compute[183278]: 2026-01-21 18:16:06.837 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 18:16:06 compute-0 nova_compute[183278]: 2026-01-21 18:16:06.838 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 18:16:06 compute-0 nova_compute[183278]: 2026-01-21 18:16:06.853 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Jan 21 18:16:06 compute-0 nova_compute[183278]: 2026-01-21 18:16:06.853 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 21 18:16:06 compute-0 nova_compute[183278]: 2026-01-21 18:16:06.854 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:16:06 compute-0 ovn_controller[95419]: 2026-01-21T18:16:06Z|00058|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 21 18:16:07 compute-0 podman[205243]: 2026-01-21 18:16:07.001463864 +0000 UTC m=+0.057233468 container health_status 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, name=ubi9-minimal, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, release=1755695350, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Jan 21 18:16:07 compute-0 nova_compute[183278]: 2026-01-21 18:16:07.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:16:07 compute-0 nova_compute[183278]: 2026-01-21 18:16:07.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:16:07 compute-0 nova_compute[183278]: 2026-01-21 18:16:07.845 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:16:07 compute-0 nova_compute[183278]: 2026-01-21 18:16:07.845 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:16:07 compute-0 nova_compute[183278]: 2026-01-21 18:16:07.845 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:16:07 compute-0 nova_compute[183278]: 2026-01-21 18:16:07.845 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 18:16:07 compute-0 nova_compute[183278]: 2026-01-21 18:16:07.998 183284 WARNING nova.virt.libvirt.driver [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 18:16:07 compute-0 nova_compute[183278]: 2026-01-21 18:16:07.999 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5877MB free_disk=73.38341522216797GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 18:16:07 compute-0 nova_compute[183278]: 2026-01-21 18:16:07.999 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:16:08 compute-0 nova_compute[183278]: 2026-01-21 18:16:07.999 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:16:08 compute-0 nova_compute[183278]: 2026-01-21 18:16:08.109 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Instance b12214ad-ceca-4678-87c7-b9f991d6959e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 21 18:16:08 compute-0 nova_compute[183278]: 2026-01-21 18:16:08.109 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 18:16:08 compute-0 nova_compute[183278]: 2026-01-21 18:16:08.110 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 18:16:08 compute-0 nova_compute[183278]: 2026-01-21 18:16:08.146 183284 DEBUG nova.compute.provider_tree [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Inventory has not changed in ProviderTree for provider: 502e4243-611b-433d-a766-9b485d51652d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 18:16:08 compute-0 nova_compute[183278]: 2026-01-21 18:16:08.161 183284 DEBUG nova.scheduler.client.report [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Inventory has not changed for provider 502e4243-611b-433d-a766-9b485d51652d based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 18:16:08 compute-0 nova_compute[183278]: 2026-01-21 18:16:08.182 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 18:16:08 compute-0 nova_compute[183278]: 2026-01-21 18:16:08.182 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.183s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:16:08 compute-0 nova_compute[183278]: 2026-01-21 18:16:08.626 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:16:09 compute-0 nova_compute[183278]: 2026-01-21 18:16:09.173 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:16:09 compute-0 nova_compute[183278]: 2026-01-21 18:16:09.182 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:16:09 compute-0 nova_compute[183278]: 2026-01-21 18:16:09.510 183284 DEBUG nova.network.neutron [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Updating instance_info_cache with network_info: [{"id": "d068a9ac-1495-43a0-9d00-8867b0e13f03", "address": "fa:16:3e:34:8d:04", "network": {"id": "b9000416-b201-46fa-8852-f77e5f1d4714", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1785313754-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aeb82e0980254fc885bc0eaa70c4cc68", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd068a9ac-14", "ovs_interfaceid": "d068a9ac-1495-43a0-9d00-8867b0e13f03", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:16:09 compute-0 nova_compute[183278]: 2026-01-21 18:16:09.536 183284 DEBUG oslo_concurrency.lockutils [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] Releasing lock "refresh_cache-b12214ad-ceca-4678-87c7-b9f991d6959e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 18:16:09 compute-0 nova_compute[183278]: 2026-01-21 18:16:09.536 183284 DEBUG nova.compute.manager [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Instance network_info: |[{"id": "d068a9ac-1495-43a0-9d00-8867b0e13f03", "address": "fa:16:3e:34:8d:04", "network": {"id": "b9000416-b201-46fa-8852-f77e5f1d4714", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1785313754-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aeb82e0980254fc885bc0eaa70c4cc68", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd068a9ac-14", "ovs_interfaceid": "d068a9ac-1495-43a0-9d00-8867b0e13f03", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 21 18:16:09 compute-0 nova_compute[183278]: 2026-01-21 18:16:09.537 183284 DEBUG oslo_concurrency.lockutils [req-2e2b09c8-1a18-43af-9838-f9a23e9c4bb8 req-d70cc4ea-6e8e-429a-8fb2-f87514ce33a9 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquired lock "refresh_cache-b12214ad-ceca-4678-87c7-b9f991d6959e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 18:16:09 compute-0 nova_compute[183278]: 2026-01-21 18:16:09.537 183284 DEBUG nova.network.neutron [req-2e2b09c8-1a18-43af-9838-f9a23e9c4bb8 req-d70cc4ea-6e8e-429a-8fb2-f87514ce33a9 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Refreshing network info cache for port d068a9ac-1495-43a0-9d00-8867b0e13f03 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 18:16:09 compute-0 nova_compute[183278]: 2026-01-21 18:16:09.540 183284 DEBUG nova.virt.libvirt.driver [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Start _get_guest_xml network_info=[{"id": "d068a9ac-1495-43a0-9d00-8867b0e13f03", "address": "fa:16:3e:34:8d:04", "network": {"id": "b9000416-b201-46fa-8852-f77e5f1d4714", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1785313754-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aeb82e0980254fc885bc0eaa70c4cc68", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd068a9ac-14", "ovs_interfaceid": "d068a9ac-1495-43a0-9d00-8867b0e13f03", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T18:09:31Z,direct_url=<?>,disk_format='qcow2',id=672306ae-5521-4fc1-a825-a16d6d125c61,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='05bd8d3790e24414a90ecf55ee1939e1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T18:09:33Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_type': 'disk', 'encryption_options': None, 'boot_index': 0, 'encryption_secret_uuid': None, 'encryption_format': None, 'image_id': '672306ae-5521-4fc1-a825-a16d6d125c61'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 18:16:09 compute-0 nova_compute[183278]: 2026-01-21 18:16:09.545 183284 WARNING nova.virt.libvirt.driver [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 18:16:09 compute-0 nova_compute[183278]: 2026-01-21 18:16:09.551 183284 DEBUG nova.virt.libvirt.host [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 18:16:09 compute-0 nova_compute[183278]: 2026-01-21 18:16:09.552 183284 DEBUG nova.virt.libvirt.host [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 18:16:09 compute-0 nova_compute[183278]: 2026-01-21 18:16:09.559 183284 DEBUG nova.virt.libvirt.host [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 18:16:09 compute-0 nova_compute[183278]: 2026-01-21 18:16:09.559 183284 DEBUG nova.virt.libvirt.host [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 18:16:09 compute-0 nova_compute[183278]: 2026-01-21 18:16:09.560 183284 DEBUG nova.virt.libvirt.driver [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 18:16:09 compute-0 nova_compute[183278]: 2026-01-21 18:16:09.561 183284 DEBUG nova.virt.hardware [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T18:09:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='45095fe9-3fd5-4f1f-87b2-a2a8292135a2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T18:09:31Z,direct_url=<?>,disk_format='qcow2',id=672306ae-5521-4fc1-a825-a16d6d125c61,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='05bd8d3790e24414a90ecf55ee1939e1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T18:09:33Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 18:16:09 compute-0 nova_compute[183278]: 2026-01-21 18:16:09.561 183284 DEBUG nova.virt.hardware [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 18:16:09 compute-0 nova_compute[183278]: 2026-01-21 18:16:09.561 183284 DEBUG nova.virt.hardware [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 18:16:09 compute-0 nova_compute[183278]: 2026-01-21 18:16:09.561 183284 DEBUG nova.virt.hardware [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 18:16:09 compute-0 nova_compute[183278]: 2026-01-21 18:16:09.561 183284 DEBUG nova.virt.hardware [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 18:16:09 compute-0 nova_compute[183278]: 2026-01-21 18:16:09.562 183284 DEBUG nova.virt.hardware [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 18:16:09 compute-0 nova_compute[183278]: 2026-01-21 18:16:09.562 183284 DEBUG nova.virt.hardware [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 18:16:09 compute-0 nova_compute[183278]: 2026-01-21 18:16:09.562 183284 DEBUG nova.virt.hardware [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 18:16:09 compute-0 nova_compute[183278]: 2026-01-21 18:16:09.562 183284 DEBUG nova.virt.hardware [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 18:16:09 compute-0 nova_compute[183278]: 2026-01-21 18:16:09.562 183284 DEBUG nova.virt.hardware [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 18:16:09 compute-0 nova_compute[183278]: 2026-01-21 18:16:09.562 183284 DEBUG nova.virt.hardware [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 18:16:09 compute-0 nova_compute[183278]: 2026-01-21 18:16:09.566 183284 DEBUG nova.virt.libvirt.vif [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T18:16:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-81450804',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-81450804',id=7,image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='aeb82e0980254fc885bc0eaa70c4cc68',ramdisk_id='',reservation_id='r-kywgsf09',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteBasicStrategy-477892887',owner_user_name='tempest-TestExecuteBasicStrategy-477892887-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T18:16:04Z,user_data=None,user_id='c22533b2094d465a9fc14ed57c562c02',uuid=b12214ad-ceca-4678-87c7-b9f991d6959e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d068a9ac-1495-43a0-9d00-8867b0e13f03", "address": "fa:16:3e:34:8d:04", "network": {"id": "b9000416-b201-46fa-8852-f77e5f1d4714", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1785313754-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aeb82e0980254fc885bc0eaa70c4cc68", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd068a9ac-14", "ovs_interfaceid": "d068a9ac-1495-43a0-9d00-8867b0e13f03", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 18:16:09 compute-0 nova_compute[183278]: 2026-01-21 18:16:09.566 183284 DEBUG nova.network.os_vif_util [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] Converting VIF {"id": "d068a9ac-1495-43a0-9d00-8867b0e13f03", "address": "fa:16:3e:34:8d:04", "network": {"id": "b9000416-b201-46fa-8852-f77e5f1d4714", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1785313754-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aeb82e0980254fc885bc0eaa70c4cc68", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd068a9ac-14", "ovs_interfaceid": "d068a9ac-1495-43a0-9d00-8867b0e13f03", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 18:16:09 compute-0 nova_compute[183278]: 2026-01-21 18:16:09.567 183284 DEBUG nova.network.os_vif_util [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:34:8d:04,bridge_name='br-int',has_traffic_filtering=True,id=d068a9ac-1495-43a0-9d00-8867b0e13f03,network=Network(b9000416-b201-46fa-8852-f77e5f1d4714),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd068a9ac-14') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 18:16:09 compute-0 nova_compute[183278]: 2026-01-21 18:16:09.568 183284 DEBUG nova.objects.instance [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] Lazy-loading 'pci_devices' on Instance uuid b12214ad-ceca-4678-87c7-b9f991d6959e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:16:09 compute-0 nova_compute[183278]: 2026-01-21 18:16:09.582 183284 DEBUG nova.virt.libvirt.driver [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] End _get_guest_xml xml=<domain type="kvm">
Jan 21 18:16:09 compute-0 nova_compute[183278]:   <uuid>b12214ad-ceca-4678-87c7-b9f991d6959e</uuid>
Jan 21 18:16:09 compute-0 nova_compute[183278]:   <name>instance-00000007</name>
Jan 21 18:16:09 compute-0 nova_compute[183278]:   <memory>131072</memory>
Jan 21 18:16:09 compute-0 nova_compute[183278]:   <vcpu>1</vcpu>
Jan 21 18:16:09 compute-0 nova_compute[183278]:   <metadata>
Jan 21 18:16:09 compute-0 nova_compute[183278]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 18:16:09 compute-0 nova_compute[183278]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 18:16:09 compute-0 nova_compute[183278]:       <nova:name>tempest-TestExecuteBasicStrategy-server-81450804</nova:name>
Jan 21 18:16:09 compute-0 nova_compute[183278]:       <nova:creationTime>2026-01-21 18:16:09</nova:creationTime>
Jan 21 18:16:09 compute-0 nova_compute[183278]:       <nova:flavor name="m1.nano">
Jan 21 18:16:09 compute-0 nova_compute[183278]:         <nova:memory>128</nova:memory>
Jan 21 18:16:09 compute-0 nova_compute[183278]:         <nova:disk>1</nova:disk>
Jan 21 18:16:09 compute-0 nova_compute[183278]:         <nova:swap>0</nova:swap>
Jan 21 18:16:09 compute-0 nova_compute[183278]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 18:16:09 compute-0 nova_compute[183278]:         <nova:vcpus>1</nova:vcpus>
Jan 21 18:16:09 compute-0 nova_compute[183278]:       </nova:flavor>
Jan 21 18:16:09 compute-0 nova_compute[183278]:       <nova:owner>
Jan 21 18:16:09 compute-0 nova_compute[183278]:         <nova:user uuid="c22533b2094d465a9fc14ed57c562c02">tempest-TestExecuteBasicStrategy-477892887-project-member</nova:user>
Jan 21 18:16:09 compute-0 nova_compute[183278]:         <nova:project uuid="aeb82e0980254fc885bc0eaa70c4cc68">tempest-TestExecuteBasicStrategy-477892887</nova:project>
Jan 21 18:16:09 compute-0 nova_compute[183278]:       </nova:owner>
Jan 21 18:16:09 compute-0 nova_compute[183278]:       <nova:root type="image" uuid="672306ae-5521-4fc1-a825-a16d6d125c61"/>
Jan 21 18:16:09 compute-0 nova_compute[183278]:       <nova:ports>
Jan 21 18:16:09 compute-0 nova_compute[183278]:         <nova:port uuid="d068a9ac-1495-43a0-9d00-8867b0e13f03">
Jan 21 18:16:09 compute-0 nova_compute[183278]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 21 18:16:09 compute-0 nova_compute[183278]:         </nova:port>
Jan 21 18:16:09 compute-0 nova_compute[183278]:       </nova:ports>
Jan 21 18:16:09 compute-0 nova_compute[183278]:     </nova:instance>
Jan 21 18:16:09 compute-0 nova_compute[183278]:   </metadata>
Jan 21 18:16:09 compute-0 nova_compute[183278]:   <sysinfo type="smbios">
Jan 21 18:16:09 compute-0 nova_compute[183278]:     <system>
Jan 21 18:16:09 compute-0 nova_compute[183278]:       <entry name="manufacturer">RDO</entry>
Jan 21 18:16:09 compute-0 nova_compute[183278]:       <entry name="product">OpenStack Compute</entry>
Jan 21 18:16:09 compute-0 nova_compute[183278]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 18:16:09 compute-0 nova_compute[183278]:       <entry name="serial">b12214ad-ceca-4678-87c7-b9f991d6959e</entry>
Jan 21 18:16:09 compute-0 nova_compute[183278]:       <entry name="uuid">b12214ad-ceca-4678-87c7-b9f991d6959e</entry>
Jan 21 18:16:09 compute-0 nova_compute[183278]:       <entry name="family">Virtual Machine</entry>
Jan 21 18:16:09 compute-0 nova_compute[183278]:     </system>
Jan 21 18:16:09 compute-0 nova_compute[183278]:   </sysinfo>
Jan 21 18:16:09 compute-0 nova_compute[183278]:   <os>
Jan 21 18:16:09 compute-0 nova_compute[183278]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 18:16:09 compute-0 nova_compute[183278]:     <boot dev="hd"/>
Jan 21 18:16:09 compute-0 nova_compute[183278]:     <smbios mode="sysinfo"/>
Jan 21 18:16:09 compute-0 nova_compute[183278]:   </os>
Jan 21 18:16:09 compute-0 nova_compute[183278]:   <features>
Jan 21 18:16:09 compute-0 nova_compute[183278]:     <acpi/>
Jan 21 18:16:09 compute-0 nova_compute[183278]:     <apic/>
Jan 21 18:16:09 compute-0 nova_compute[183278]:     <vmcoreinfo/>
Jan 21 18:16:09 compute-0 nova_compute[183278]:   </features>
Jan 21 18:16:09 compute-0 nova_compute[183278]:   <clock offset="utc">
Jan 21 18:16:09 compute-0 nova_compute[183278]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 18:16:09 compute-0 nova_compute[183278]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 18:16:09 compute-0 nova_compute[183278]:     <timer name="hpet" present="no"/>
Jan 21 18:16:09 compute-0 nova_compute[183278]:   </clock>
Jan 21 18:16:09 compute-0 nova_compute[183278]:   <cpu mode="custom" match="exact">
Jan 21 18:16:09 compute-0 nova_compute[183278]:     <model>Nehalem</model>
Jan 21 18:16:09 compute-0 nova_compute[183278]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 18:16:09 compute-0 nova_compute[183278]:   </cpu>
Jan 21 18:16:09 compute-0 nova_compute[183278]:   <devices>
Jan 21 18:16:09 compute-0 nova_compute[183278]:     <disk type="file" device="disk">
Jan 21 18:16:09 compute-0 nova_compute[183278]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 18:16:09 compute-0 nova_compute[183278]:       <source file="/var/lib/nova/instances/b12214ad-ceca-4678-87c7-b9f991d6959e/disk"/>
Jan 21 18:16:09 compute-0 nova_compute[183278]:       <target dev="vda" bus="virtio"/>
Jan 21 18:16:09 compute-0 nova_compute[183278]:     </disk>
Jan 21 18:16:09 compute-0 nova_compute[183278]:     <disk type="file" device="cdrom">
Jan 21 18:16:09 compute-0 nova_compute[183278]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 18:16:09 compute-0 nova_compute[183278]:       <source file="/var/lib/nova/instances/b12214ad-ceca-4678-87c7-b9f991d6959e/disk.config"/>
Jan 21 18:16:09 compute-0 nova_compute[183278]:       <target dev="sda" bus="sata"/>
Jan 21 18:16:09 compute-0 nova_compute[183278]:     </disk>
Jan 21 18:16:09 compute-0 nova_compute[183278]:     <interface type="ethernet">
Jan 21 18:16:09 compute-0 nova_compute[183278]:       <mac address="fa:16:3e:34:8d:04"/>
Jan 21 18:16:09 compute-0 nova_compute[183278]:       <model type="virtio"/>
Jan 21 18:16:09 compute-0 nova_compute[183278]:       <driver name="vhost" rx_queue_size="512"/>
Jan 21 18:16:09 compute-0 nova_compute[183278]:       <mtu size="1442"/>
Jan 21 18:16:09 compute-0 nova_compute[183278]:       <target dev="tapd068a9ac-14"/>
Jan 21 18:16:09 compute-0 nova_compute[183278]:     </interface>
Jan 21 18:16:09 compute-0 nova_compute[183278]:     <serial type="pty">
Jan 21 18:16:09 compute-0 nova_compute[183278]:       <log file="/var/lib/nova/instances/b12214ad-ceca-4678-87c7-b9f991d6959e/console.log" append="off"/>
Jan 21 18:16:09 compute-0 nova_compute[183278]:     </serial>
Jan 21 18:16:09 compute-0 nova_compute[183278]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 18:16:09 compute-0 nova_compute[183278]:     <video>
Jan 21 18:16:09 compute-0 nova_compute[183278]:       <model type="virtio"/>
Jan 21 18:16:09 compute-0 nova_compute[183278]:     </video>
Jan 21 18:16:09 compute-0 nova_compute[183278]:     <input type="tablet" bus="usb"/>
Jan 21 18:16:09 compute-0 nova_compute[183278]:     <rng model="virtio">
Jan 21 18:16:09 compute-0 nova_compute[183278]:       <backend model="random">/dev/urandom</backend>
Jan 21 18:16:09 compute-0 nova_compute[183278]:     </rng>
Jan 21 18:16:09 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root"/>
Jan 21 18:16:09 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:16:09 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:16:09 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:16:09 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:16:09 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:16:09 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:16:09 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:16:09 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:16:09 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:16:09 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:16:09 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:16:09 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:16:09 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:16:09 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:16:09 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:16:09 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:16:09 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:16:09 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:16:09 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:16:09 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:16:09 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:16:09 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:16:09 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:16:09 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:16:09 compute-0 nova_compute[183278]:     <controller type="usb" index="0"/>
Jan 21 18:16:09 compute-0 nova_compute[183278]:     <memballoon model="virtio">
Jan 21 18:16:09 compute-0 nova_compute[183278]:       <stats period="10"/>
Jan 21 18:16:09 compute-0 nova_compute[183278]:     </memballoon>
Jan 21 18:16:09 compute-0 nova_compute[183278]:   </devices>
Jan 21 18:16:09 compute-0 nova_compute[183278]: </domain>
Jan 21 18:16:09 compute-0 nova_compute[183278]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 18:16:09 compute-0 nova_compute[183278]: 2026-01-21 18:16:09.582 183284 DEBUG nova.compute.manager [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Preparing to wait for external event network-vif-plugged-d068a9ac-1495-43a0-9d00-8867b0e13f03 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 21 18:16:09 compute-0 nova_compute[183278]: 2026-01-21 18:16:09.583 183284 DEBUG oslo_concurrency.lockutils [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] Acquiring lock "b12214ad-ceca-4678-87c7-b9f991d6959e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:16:09 compute-0 nova_compute[183278]: 2026-01-21 18:16:09.583 183284 DEBUG oslo_concurrency.lockutils [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] Lock "b12214ad-ceca-4678-87c7-b9f991d6959e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:16:09 compute-0 nova_compute[183278]: 2026-01-21 18:16:09.583 183284 DEBUG oslo_concurrency.lockutils [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] Lock "b12214ad-ceca-4678-87c7-b9f991d6959e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:16:09 compute-0 nova_compute[183278]: 2026-01-21 18:16:09.584 183284 DEBUG nova.virt.libvirt.vif [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T18:16:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-81450804',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-81450804',id=7,image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='aeb82e0980254fc885bc0eaa70c4cc68',ramdisk_id='',reservation_id='r-kywgsf09',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteBasicStrategy-477892887',owner_user_name='tempest-TestExecuteBasicStrategy-477892887-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T18:16:04Z,user_data=None,user_id='c22533b2094d465a9fc14ed57c562c02',uuid=b12214ad-ceca-4678-87c7-b9f991d6959e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d068a9ac-1495-43a0-9d00-8867b0e13f03", "address": "fa:16:3e:34:8d:04", "network": {"id": "b9000416-b201-46fa-8852-f77e5f1d4714", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1785313754-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aeb82e0980254fc885bc0eaa70c4cc68", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd068a9ac-14", "ovs_interfaceid": "d068a9ac-1495-43a0-9d00-8867b0e13f03", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 21 18:16:09 compute-0 nova_compute[183278]: 2026-01-21 18:16:09.584 183284 DEBUG nova.network.os_vif_util [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] Converting VIF {"id": "d068a9ac-1495-43a0-9d00-8867b0e13f03", "address": "fa:16:3e:34:8d:04", "network": {"id": "b9000416-b201-46fa-8852-f77e5f1d4714", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1785313754-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aeb82e0980254fc885bc0eaa70c4cc68", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd068a9ac-14", "ovs_interfaceid": "d068a9ac-1495-43a0-9d00-8867b0e13f03", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 18:16:09 compute-0 nova_compute[183278]: 2026-01-21 18:16:09.585 183284 DEBUG nova.network.os_vif_util [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:34:8d:04,bridge_name='br-int',has_traffic_filtering=True,id=d068a9ac-1495-43a0-9d00-8867b0e13f03,network=Network(b9000416-b201-46fa-8852-f77e5f1d4714),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd068a9ac-14') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 18:16:09 compute-0 nova_compute[183278]: 2026-01-21 18:16:09.585 183284 DEBUG os_vif [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:34:8d:04,bridge_name='br-int',has_traffic_filtering=True,id=d068a9ac-1495-43a0-9d00-8867b0e13f03,network=Network(b9000416-b201-46fa-8852-f77e5f1d4714),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd068a9ac-14') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 21 18:16:09 compute-0 nova_compute[183278]: 2026-01-21 18:16:09.585 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:16:09 compute-0 nova_compute[183278]: 2026-01-21 18:16:09.586 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:16:09 compute-0 nova_compute[183278]: 2026-01-21 18:16:09.586 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 18:16:09 compute-0 nova_compute[183278]: 2026-01-21 18:16:09.589 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:16:09 compute-0 nova_compute[183278]: 2026-01-21 18:16:09.589 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd068a9ac-14, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:16:09 compute-0 nova_compute[183278]: 2026-01-21 18:16:09.590 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd068a9ac-14, col_values=(('external_ids', {'iface-id': 'd068a9ac-1495-43a0-9d00-8867b0e13f03', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:34:8d:04', 'vm-uuid': 'b12214ad-ceca-4678-87c7-b9f991d6959e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:16:09 compute-0 nova_compute[183278]: 2026-01-21 18:16:09.592 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:16:09 compute-0 NetworkManager[55506]: <info>  [1769019369.5930] manager: (tapd068a9ac-14): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/32)
Jan 21 18:16:09 compute-0 nova_compute[183278]: 2026-01-21 18:16:09.595 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 18:16:09 compute-0 nova_compute[183278]: 2026-01-21 18:16:09.597 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:16:09 compute-0 nova_compute[183278]: 2026-01-21 18:16:09.598 183284 INFO os_vif [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:34:8d:04,bridge_name='br-int',has_traffic_filtering=True,id=d068a9ac-1495-43a0-9d00-8867b0e13f03,network=Network(b9000416-b201-46fa-8852-f77e5f1d4714),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd068a9ac-14')
Jan 21 18:16:09 compute-0 nova_compute[183278]: 2026-01-21 18:16:09.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:16:09 compute-0 nova_compute[183278]: 2026-01-21 18:16:09.816 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 18:16:09 compute-0 nova_compute[183278]: 2026-01-21 18:16:09.827 183284 DEBUG nova.virt.libvirt.driver [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 18:16:09 compute-0 nova_compute[183278]: 2026-01-21 18:16:09.827 183284 DEBUG nova.virt.libvirt.driver [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 18:16:09 compute-0 nova_compute[183278]: 2026-01-21 18:16:09.827 183284 DEBUG nova.virt.libvirt.driver [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] No VIF found with MAC fa:16:3e:34:8d:04, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 21 18:16:09 compute-0 nova_compute[183278]: 2026-01-21 18:16:09.828 183284 INFO nova.virt.libvirt.driver [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Using config drive
Jan 21 18:16:10 compute-0 nova_compute[183278]: 2026-01-21 18:16:10.621 183284 INFO nova.virt.libvirt.driver [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Creating config drive at /var/lib/nova/instances/b12214ad-ceca-4678-87c7-b9f991d6959e/disk.config
Jan 21 18:16:10 compute-0 nova_compute[183278]: 2026-01-21 18:16:10.626 183284 DEBUG oslo_concurrency.processutils [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b12214ad-ceca-4678-87c7-b9f991d6959e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpi1wy2jl1 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:16:10 compute-0 nova_compute[183278]: 2026-01-21 18:16:10.748 183284 DEBUG oslo_concurrency.processutils [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b12214ad-ceca-4678-87c7-b9f991d6959e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpi1wy2jl1" returned: 0 in 0.122s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:16:10 compute-0 kernel: tapd068a9ac-14: entered promiscuous mode
Jan 21 18:16:10 compute-0 NetworkManager[55506]: <info>  [1769019370.7957] manager: (tapd068a9ac-14): new Tun device (/org/freedesktop/NetworkManager/Devices/33)
Jan 21 18:16:10 compute-0 ovn_controller[95419]: 2026-01-21T18:16:10Z|00059|binding|INFO|Claiming lport d068a9ac-1495-43a0-9d00-8867b0e13f03 for this chassis.
Jan 21 18:16:10 compute-0 ovn_controller[95419]: 2026-01-21T18:16:10Z|00060|binding|INFO|d068a9ac-1495-43a0-9d00-8867b0e13f03: Claiming fa:16:3e:34:8d:04 10.100.0.12
Jan 21 18:16:10 compute-0 nova_compute[183278]: 2026-01-21 18:16:10.796 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:16:10 compute-0 nova_compute[183278]: 2026-01-21 18:16:10.800 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:16:10 compute-0 nova_compute[183278]: 2026-01-21 18:16:10.803 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:16:10 compute-0 nova_compute[183278]: 2026-01-21 18:16:10.811 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:16:10 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:16:10.813 104698 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:34:8d:04 10.100.0.12'], port_security=['fa:16:3e:34:8d:04 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'b12214ad-ceca-4678-87c7-b9f991d6959e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b9000416-b201-46fa-8852-f77e5f1d4714', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'aeb82e0980254fc885bc0eaa70c4cc68', 'neutron:revision_number': '2', 'neutron:security_group_ids': '823f317c-7d21-4ee3-849b-1072420150d0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6a49360a-b626-4349-8811-d78c04e8937e, chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>], logical_port=d068a9ac-1495-43a0-9d00-8867b0e13f03) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 18:16:10 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:16:10.814 104698 INFO neutron.agent.ovn.metadata.agent [-] Port d068a9ac-1495-43a0-9d00-8867b0e13f03 in datapath b9000416-b201-46fa-8852-f77e5f1d4714 bound to our chassis
Jan 21 18:16:10 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:16:10.815 104698 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b9000416-b201-46fa-8852-f77e5f1d4714
Jan 21 18:16:10 compute-0 nova_compute[183278]: 2026-01-21 18:16:10.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:16:10 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:16:10.825 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[fa238405-6df7-4a11-9a13-4dabe71504ae]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:16:10 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:16:10.826 104698 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb9000416-b1 in ovnmeta-b9000416-b201-46fa-8852-f77e5f1d4714 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 21 18:16:10 compute-0 systemd-machined[154592]: New machine qemu-5-instance-00000007.
Jan 21 18:16:10 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:16:10.828 203892 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb9000416-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 21 18:16:10 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:16:10.828 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[d18933e8-44de-4888-acc7-e6bb67c45b32]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:16:10 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:16:10.829 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[6e1c7cfc-1e98-4e6c-a48e-03d9e48ef174]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:16:10 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:16:10.841 105036 DEBUG oslo.privsep.daemon [-] privsep: reply[d3c64293-bbf0-4f1b-9b4d-f4fd1528a421]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:16:10 compute-0 systemd[1]: Started Virtual Machine qemu-5-instance-00000007.
Jan 21 18:16:10 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:16:10.856 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[02f6e48f-485d-4960-8c80-51c6dd8a0957]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:16:10 compute-0 nova_compute[183278]: 2026-01-21 18:16:10.860 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:16:10 compute-0 ovn_controller[95419]: 2026-01-21T18:16:10Z|00061|binding|INFO|Setting lport d068a9ac-1495-43a0-9d00-8867b0e13f03 ovn-installed in OVS
Jan 21 18:16:10 compute-0 ovn_controller[95419]: 2026-01-21T18:16:10Z|00062|binding|INFO|Setting lport d068a9ac-1495-43a0-9d00-8867b0e13f03 up in Southbound
Jan 21 18:16:10 compute-0 nova_compute[183278]: 2026-01-21 18:16:10.863 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:16:10 compute-0 systemd-udevd[205288]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 18:16:10 compute-0 NetworkManager[55506]: <info>  [1769019370.8794] device (tapd068a9ac-14): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 18:16:10 compute-0 NetworkManager[55506]: <info>  [1769019370.8801] device (tapd068a9ac-14): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 18:16:10 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:16:10.887 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[25f75778-a86d-4f01-bd26-a986ccf84491]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:16:10 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:16:10.891 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[a7f3fc25-ad86-47bf-9910-8d5172c395a7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:16:10 compute-0 NetworkManager[55506]: <info>  [1769019370.8924] manager: (tapb9000416-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/34)
Jan 21 18:16:10 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:16:10.918 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[5afa4f14-2c50-4438-a7f0-fb5333bd43dd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:16:10 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:16:10.921 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[79e93efb-ce0b-4da5-af35-060701e314ce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:16:10 compute-0 NetworkManager[55506]: <info>  [1769019370.9444] device (tapb9000416-b0): carrier: link connected
Jan 21 18:16:10 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:16:10.950 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[b279d6ed-3ee7-4dca-b044-106c50425d26]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:16:10 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:16:10.973 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[60b11c86-8c52-4e0a-a1f0-4c441fd15f04]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb9000416-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:39:6f:a9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 22], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 393455, 'reachable_time': 17492, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 205317, 'error': None, 'target': 'ovnmeta-b9000416-b201-46fa-8852-f77e5f1d4714', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:16:10 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:16:10.988 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[4f25f8bc-40cf-439b-bcc6-2f45da68a34c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe39:6fa9'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 393455, 'tstamp': 393455}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 205318, 'error': None, 'target': 'ovnmeta-b9000416-b201-46fa-8852-f77e5f1d4714', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:16:11 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:16:11.002 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[4e663d57-0fbb-4d7d-9e47-c3ff2fb5965a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb9000416-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:39:6f:a9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 22], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 393455, 'reachable_time': 17492, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 205319, 'error': None, 'target': 'ovnmeta-b9000416-b201-46fa-8852-f77e5f1d4714', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:16:11 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:16:11.030 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[ce9562e9-8600-4f9e-ac8b-df6bde896c23]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:16:11 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:16:11.084 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[1bbd347d-a2e4-43b8-8743-c680a075ad2e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:16:11 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:16:11.086 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb9000416-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:16:11 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:16:11.086 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 18:16:11 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:16:11.086 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb9000416-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:16:11 compute-0 nova_compute[183278]: 2026-01-21 18:16:11.088 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:16:11 compute-0 kernel: tapb9000416-b0: entered promiscuous mode
Jan 21 18:16:11 compute-0 NetworkManager[55506]: <info>  [1769019371.0891] manager: (tapb9000416-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/35)
Jan 21 18:16:11 compute-0 nova_compute[183278]: 2026-01-21 18:16:11.090 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:16:11 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:16:11.095 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb9000416-b0, col_values=(('external_ids', {'iface-id': 'ca79930b-5c55-4671-8350-d51db8c90360'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:16:11 compute-0 nova_compute[183278]: 2026-01-21 18:16:11.096 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:16:11 compute-0 ovn_controller[95419]: 2026-01-21T18:16:11Z|00063|binding|INFO|Releasing lport ca79930b-5c55-4671-8350-d51db8c90360 from this chassis (sb_readonly=0)
Jan 21 18:16:11 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:16:11.100 104698 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b9000416-b201-46fa-8852-f77e5f1d4714.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b9000416-b201-46fa-8852-f77e5f1d4714.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 21 18:16:11 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:16:11.101 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[21284dd6-8693-4dcf-8e7a-dea84156c076]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:16:11 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:16:11.101 104698 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 18:16:11 compute-0 ovn_metadata_agent[104693]: global
Jan 21 18:16:11 compute-0 ovn_metadata_agent[104693]:     log         /dev/log local0 debug
Jan 21 18:16:11 compute-0 ovn_metadata_agent[104693]:     log-tag     haproxy-metadata-proxy-b9000416-b201-46fa-8852-f77e5f1d4714
Jan 21 18:16:11 compute-0 ovn_metadata_agent[104693]:     user        root
Jan 21 18:16:11 compute-0 ovn_metadata_agent[104693]:     group       root
Jan 21 18:16:11 compute-0 ovn_metadata_agent[104693]:     maxconn     1024
Jan 21 18:16:11 compute-0 ovn_metadata_agent[104693]:     pidfile     /var/lib/neutron/external/pids/b9000416-b201-46fa-8852-f77e5f1d4714.pid.haproxy
Jan 21 18:16:11 compute-0 ovn_metadata_agent[104693]:     daemon
Jan 21 18:16:11 compute-0 ovn_metadata_agent[104693]: 
Jan 21 18:16:11 compute-0 ovn_metadata_agent[104693]: defaults
Jan 21 18:16:11 compute-0 ovn_metadata_agent[104693]:     log global
Jan 21 18:16:11 compute-0 ovn_metadata_agent[104693]:     mode http
Jan 21 18:16:11 compute-0 ovn_metadata_agent[104693]:     option httplog
Jan 21 18:16:11 compute-0 ovn_metadata_agent[104693]:     option dontlognull
Jan 21 18:16:11 compute-0 ovn_metadata_agent[104693]:     option http-server-close
Jan 21 18:16:11 compute-0 ovn_metadata_agent[104693]:     option forwardfor
Jan 21 18:16:11 compute-0 ovn_metadata_agent[104693]:     retries                 3
Jan 21 18:16:11 compute-0 ovn_metadata_agent[104693]:     timeout http-request    30s
Jan 21 18:16:11 compute-0 ovn_metadata_agent[104693]:     timeout connect         30s
Jan 21 18:16:11 compute-0 ovn_metadata_agent[104693]:     timeout client          32s
Jan 21 18:16:11 compute-0 ovn_metadata_agent[104693]:     timeout server          32s
Jan 21 18:16:11 compute-0 ovn_metadata_agent[104693]:     timeout http-keep-alive 30s
Jan 21 18:16:11 compute-0 ovn_metadata_agent[104693]: 
Jan 21 18:16:11 compute-0 ovn_metadata_agent[104693]: 
Jan 21 18:16:11 compute-0 ovn_metadata_agent[104693]: listen listener
Jan 21 18:16:11 compute-0 ovn_metadata_agent[104693]:     bind 169.254.169.254:80
Jan 21 18:16:11 compute-0 ovn_metadata_agent[104693]:     server metadata /var/lib/neutron/metadata_proxy
Jan 21 18:16:11 compute-0 ovn_metadata_agent[104693]:     http-request add-header X-OVN-Network-ID b9000416-b201-46fa-8852-f77e5f1d4714
Jan 21 18:16:11 compute-0 ovn_metadata_agent[104693]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 21 18:16:11 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:16:11.102 104698 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b9000416-b201-46fa-8852-f77e5f1d4714', 'env', 'PROCESS_TAG=haproxy-b9000416-b201-46fa-8852-f77e5f1d4714', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b9000416-b201-46fa-8852-f77e5f1d4714.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 21 18:16:11 compute-0 nova_compute[183278]: 2026-01-21 18:16:11.109 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:16:11 compute-0 nova_compute[183278]: 2026-01-21 18:16:11.485 183284 DEBUG nova.virt.driver [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Emitting event <LifecycleEvent: 1769019371.484992, b12214ad-ceca-4678-87c7-b9f991d6959e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:16:11 compute-0 nova_compute[183278]: 2026-01-21 18:16:11.486 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] VM Started (Lifecycle Event)
Jan 21 18:16:11 compute-0 nova_compute[183278]: 2026-01-21 18:16:11.511 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:16:11 compute-0 nova_compute[183278]: 2026-01-21 18:16:11.516 183284 DEBUG nova.virt.driver [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Emitting event <LifecycleEvent: 1769019371.487358, b12214ad-ceca-4678-87c7-b9f991d6959e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:16:11 compute-0 nova_compute[183278]: 2026-01-21 18:16:11.516 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] VM Paused (Lifecycle Event)
Jan 21 18:16:11 compute-0 nova_compute[183278]: 2026-01-21 18:16:11.534 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:16:11 compute-0 nova_compute[183278]: 2026-01-21 18:16:11.537 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 18:16:11 compute-0 podman[205357]: 2026-01-21 18:16:11.545521411 +0000 UTC m=+0.088693297 container create d104471c7c62f1d0fd3d2f07d368eaafca942cb97befdd155d3ea9625a5fc1a9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b9000416-b201-46fa-8852-f77e5f1d4714, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 21 18:16:11 compute-0 nova_compute[183278]: 2026-01-21 18:16:11.559 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 18:16:11 compute-0 podman[205357]: 2026-01-21 18:16:11.47937753 +0000 UTC m=+0.022549436 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 18:16:11 compute-0 systemd[1]: Started libpod-conmon-d104471c7c62f1d0fd3d2f07d368eaafca942cb97befdd155d3ea9625a5fc1a9.scope.
Jan 21 18:16:11 compute-0 systemd[1]: Started libcrun container.
Jan 21 18:16:11 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/83f23d12cef94d6e22833c0cb8e91c0e550bbf9140b2df35e0707f1d315c39e7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 18:16:11 compute-0 podman[205357]: 2026-01-21 18:16:11.634518123 +0000 UTC m=+0.177690029 container init d104471c7c62f1d0fd3d2f07d368eaafca942cb97befdd155d3ea9625a5fc1a9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b9000416-b201-46fa-8852-f77e5f1d4714, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 21 18:16:11 compute-0 podman[205357]: 2026-01-21 18:16:11.639783371 +0000 UTC m=+0.182955257 container start d104471c7c62f1d0fd3d2f07d368eaafca942cb97befdd155d3ea9625a5fc1a9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b9000416-b201-46fa-8852-f77e5f1d4714, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Jan 21 18:16:11 compute-0 neutron-haproxy-ovnmeta-b9000416-b201-46fa-8852-f77e5f1d4714[205373]: [NOTICE]   (205377) : New worker (205379) forked
Jan 21 18:16:11 compute-0 neutron-haproxy-ovnmeta-b9000416-b201-46fa-8852-f77e5f1d4714[205373]: [NOTICE]   (205377) : Loading success.
Jan 21 18:16:11 compute-0 nova_compute[183278]: 2026-01-21 18:16:11.665 183284 DEBUG nova.compute.manager [req-afab15b2-1012-4488-8708-fd40f807d479 req-3f811e11-7739-4b78-9152-fda5f72386a4 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Received event network-vif-plugged-d068a9ac-1495-43a0-9d00-8867b0e13f03 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:16:11 compute-0 nova_compute[183278]: 2026-01-21 18:16:11.666 183284 DEBUG oslo_concurrency.lockutils [req-afab15b2-1012-4488-8708-fd40f807d479 req-3f811e11-7739-4b78-9152-fda5f72386a4 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "b12214ad-ceca-4678-87c7-b9f991d6959e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:16:11 compute-0 nova_compute[183278]: 2026-01-21 18:16:11.667 183284 DEBUG oslo_concurrency.lockutils [req-afab15b2-1012-4488-8708-fd40f807d479 req-3f811e11-7739-4b78-9152-fda5f72386a4 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "b12214ad-ceca-4678-87c7-b9f991d6959e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:16:11 compute-0 nova_compute[183278]: 2026-01-21 18:16:11.667 183284 DEBUG oslo_concurrency.lockutils [req-afab15b2-1012-4488-8708-fd40f807d479 req-3f811e11-7739-4b78-9152-fda5f72386a4 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "b12214ad-ceca-4678-87c7-b9f991d6959e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:16:11 compute-0 nova_compute[183278]: 2026-01-21 18:16:11.667 183284 DEBUG nova.compute.manager [req-afab15b2-1012-4488-8708-fd40f807d479 req-3f811e11-7739-4b78-9152-fda5f72386a4 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Processing event network-vif-plugged-d068a9ac-1495-43a0-9d00-8867b0e13f03 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 21 18:16:11 compute-0 nova_compute[183278]: 2026-01-21 18:16:11.668 183284 DEBUG nova.compute.manager [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 18:16:11 compute-0 nova_compute[183278]: 2026-01-21 18:16:11.671 183284 DEBUG nova.virt.driver [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Emitting event <LifecycleEvent: 1769019371.671717, b12214ad-ceca-4678-87c7-b9f991d6959e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:16:11 compute-0 nova_compute[183278]: 2026-01-21 18:16:11.672 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] VM Resumed (Lifecycle Event)
Jan 21 18:16:11 compute-0 nova_compute[183278]: 2026-01-21 18:16:11.673 183284 DEBUG nova.virt.libvirt.driver [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 18:16:11 compute-0 nova_compute[183278]: 2026-01-21 18:16:11.677 183284 INFO nova.virt.libvirt.driver [-] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Instance spawned successfully.
Jan 21 18:16:11 compute-0 nova_compute[183278]: 2026-01-21 18:16:11.678 183284 DEBUG nova.virt.libvirt.driver [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 21 18:16:11 compute-0 nova_compute[183278]: 2026-01-21 18:16:11.701 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:16:11 compute-0 nova_compute[183278]: 2026-01-21 18:16:11.707 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 18:16:11 compute-0 nova_compute[183278]: 2026-01-21 18:16:11.710 183284 DEBUG nova.virt.libvirt.driver [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:16:11 compute-0 nova_compute[183278]: 2026-01-21 18:16:11.710 183284 DEBUG nova.virt.libvirt.driver [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:16:11 compute-0 nova_compute[183278]: 2026-01-21 18:16:11.711 183284 DEBUG nova.virt.libvirt.driver [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:16:11 compute-0 nova_compute[183278]: 2026-01-21 18:16:11.711 183284 DEBUG nova.virt.libvirt.driver [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:16:11 compute-0 nova_compute[183278]: 2026-01-21 18:16:11.711 183284 DEBUG nova.virt.libvirt.driver [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:16:11 compute-0 nova_compute[183278]: 2026-01-21 18:16:11.712 183284 DEBUG nova.virt.libvirt.driver [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:16:11 compute-0 nova_compute[183278]: 2026-01-21 18:16:11.745 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 18:16:11 compute-0 nova_compute[183278]: 2026-01-21 18:16:11.785 183284 INFO nova.compute.manager [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Took 7.32 seconds to spawn the instance on the hypervisor.
Jan 21 18:16:11 compute-0 nova_compute[183278]: 2026-01-21 18:16:11.786 183284 DEBUG nova.compute.manager [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:16:11 compute-0 nova_compute[183278]: 2026-01-21 18:16:11.848 183284 INFO nova.compute.manager [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Took 7.77 seconds to build instance.
Jan 21 18:16:11 compute-0 nova_compute[183278]: 2026-01-21 18:16:11.864 183284 DEBUG oslo_concurrency.lockutils [None req-8fc3302a-8edb-430c-b60a-9a3bcf0b72bd c22533b2094d465a9fc14ed57c562c02 aeb82e0980254fc885bc0eaa70c4cc68 - - default default] Lock "b12214ad-ceca-4678-87c7-b9f991d6959e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.863s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:16:12 compute-0 nova_compute[183278]: 2026-01-21 18:16:12.618 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:16:12 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:16:12.618 104698 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e2:aa:66', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f6:39:87:73:c8'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 18:16:12 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:16:12.619 104698 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 21 18:16:12 compute-0 nova_compute[183278]: 2026-01-21 18:16:12.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:16:12 compute-0 nova_compute[183278]: 2026-01-21 18:16:12.824 183284 DEBUG nova.network.neutron [req-2e2b09c8-1a18-43af-9838-f9a23e9c4bb8 req-d70cc4ea-6e8e-429a-8fb2-f87514ce33a9 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Updated VIF entry in instance network info cache for port d068a9ac-1495-43a0-9d00-8867b0e13f03. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 18:16:12 compute-0 nova_compute[183278]: 2026-01-21 18:16:12.824 183284 DEBUG nova.network.neutron [req-2e2b09c8-1a18-43af-9838-f9a23e9c4bb8 req-d70cc4ea-6e8e-429a-8fb2-f87514ce33a9 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Updating instance_info_cache with network_info: [{"id": "d068a9ac-1495-43a0-9d00-8867b0e13f03", "address": "fa:16:3e:34:8d:04", "network": {"id": "b9000416-b201-46fa-8852-f77e5f1d4714", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1785313754-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aeb82e0980254fc885bc0eaa70c4cc68", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd068a9ac-14", "ovs_interfaceid": "d068a9ac-1495-43a0-9d00-8867b0e13f03", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:16:12 compute-0 nova_compute[183278]: 2026-01-21 18:16:12.839 183284 DEBUG oslo_concurrency.lockutils [req-2e2b09c8-1a18-43af-9838-f9a23e9c4bb8 req-d70cc4ea-6e8e-429a-8fb2-f87514ce33a9 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Releasing lock "refresh_cache-b12214ad-ceca-4678-87c7-b9f991d6959e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 18:16:13 compute-0 nova_compute[183278]: 2026-01-21 18:16:13.835 183284 DEBUG nova.compute.manager [req-99137c1d-37db-4f75-87dd-830e1c0cbd3f req-b23b9c17-036a-4567-9596-d524c896c026 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Received event network-vif-plugged-d068a9ac-1495-43a0-9d00-8867b0e13f03 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:16:13 compute-0 nova_compute[183278]: 2026-01-21 18:16:13.836 183284 DEBUG oslo_concurrency.lockutils [req-99137c1d-37db-4f75-87dd-830e1c0cbd3f req-b23b9c17-036a-4567-9596-d524c896c026 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "b12214ad-ceca-4678-87c7-b9f991d6959e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:16:13 compute-0 nova_compute[183278]: 2026-01-21 18:16:13.836 183284 DEBUG oslo_concurrency.lockutils [req-99137c1d-37db-4f75-87dd-830e1c0cbd3f req-b23b9c17-036a-4567-9596-d524c896c026 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "b12214ad-ceca-4678-87c7-b9f991d6959e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:16:13 compute-0 nova_compute[183278]: 2026-01-21 18:16:13.836 183284 DEBUG oslo_concurrency.lockutils [req-99137c1d-37db-4f75-87dd-830e1c0cbd3f req-b23b9c17-036a-4567-9596-d524c896c026 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "b12214ad-ceca-4678-87c7-b9f991d6959e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:16:13 compute-0 nova_compute[183278]: 2026-01-21 18:16:13.837 183284 DEBUG nova.compute.manager [req-99137c1d-37db-4f75-87dd-830e1c0cbd3f req-b23b9c17-036a-4567-9596-d524c896c026 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] No waiting events found dispatching network-vif-plugged-d068a9ac-1495-43a0-9d00-8867b0e13f03 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:16:13 compute-0 nova_compute[183278]: 2026-01-21 18:16:13.837 183284 WARNING nova.compute.manager [req-99137c1d-37db-4f75-87dd-830e1c0cbd3f req-b23b9c17-036a-4567-9596-d524c896c026 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Received unexpected event network-vif-plugged-d068a9ac-1495-43a0-9d00-8867b0e13f03 for instance with vm_state active and task_state None.
Jan 21 18:16:13 compute-0 podman[205389]: 2026-01-21 18:16:13.999616212 +0000 UTC m=+0.050361490 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 21 18:16:14 compute-0 podman[205388]: 2026-01-21 18:16:14.02723964 +0000 UTC m=+0.078414399 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251202)
Jan 21 18:16:14 compute-0 nova_compute[183278]: 2026-01-21 18:16:14.204 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:16:14 compute-0 nova_compute[183278]: 2026-01-21 18:16:14.592 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:16:15 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:16:15.621 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a4db021d-a451-4e5f-8011-49af760bda68, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:16:17 compute-0 podman[205434]: 2026-01-21 18:16:17.995003056 +0000 UTC m=+0.053744821 container health_status 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 21 18:16:19 compute-0 nova_compute[183278]: 2026-01-21 18:16:19.248 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:16:19 compute-0 nova_compute[183278]: 2026-01-21 18:16:19.594 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:16:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:16:20.073 104698 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:16:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:16:20.073 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:16:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:16:20.074 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:16:24 compute-0 nova_compute[183278]: 2026-01-21 18:16:24.251 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:16:24 compute-0 nova_compute[183278]: 2026-01-21 18:16:24.597 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:16:27 compute-0 sshd-session[205473]: Connection closed by authenticating user root 45.148.10.121 port 51890 [preauth]
Jan 21 18:16:27 compute-0 ovn_controller[95419]: 2026-01-21T18:16:27Z|00012|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:34:8d:04 10.100.0.12
Jan 21 18:16:27 compute-0 ovn_controller[95419]: 2026-01-21T18:16:27Z|00013|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:34:8d:04 10.100.0.12
Jan 21 18:16:29 compute-0 nova_compute[183278]: 2026-01-21 18:16:29.309 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:16:29 compute-0 nova_compute[183278]: 2026-01-21 18:16:29.599 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:16:29 compute-0 podman[192560]: time="2026-01-21T18:16:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 18:16:29 compute-0 podman[192560]: @ - - [21/Jan/2026:18:16:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16587 "" "Go-http-client/1.1"
Jan 21 18:16:29 compute-0 podman[192560]: @ - - [21/Jan/2026:18:16:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2635 "" "Go-http-client/1.1"
Jan 21 18:16:31 compute-0 openstack_network_exporter[195402]: ERROR   18:16:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 21 18:16:31 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:16:31 compute-0 openstack_network_exporter[195402]: ERROR   18:16:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 21 18:16:31 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:16:34 compute-0 nova_compute[183278]: 2026-01-21 18:16:34.312 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:16:34 compute-0 nova_compute[183278]: 2026-01-21 18:16:34.601 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:16:38 compute-0 podman[205475]: 2026-01-21 18:16:38.001387244 +0000 UTC m=+0.055152405 container health_status 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, version=9.6, container_name=openstack_network_exporter, config_id=openstack_network_exporter, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vcs-type=git, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 21 18:16:39 compute-0 nova_compute[183278]: 2026-01-21 18:16:39.313 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:16:39 compute-0 nova_compute[183278]: 2026-01-21 18:16:39.603 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:16:44 compute-0 nova_compute[183278]: 2026-01-21 18:16:44.316 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:16:44 compute-0 podman[205497]: 2026-01-21 18:16:44.420788381 +0000 UTC m=+0.078212983 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, 
org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Jan 21 18:16:44 compute-0 podman[205498]: 2026-01-21 18:16:44.420802412 +0000 UTC m=+0.074573945 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 21 18:16:44 compute-0 nova_compute[183278]: 2026-01-21 18:16:44.604 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:16:48 compute-0 podman[205544]: 2026-01-21 18:16:48.992198635 +0000 UTC m=+0.052888171 container health_status 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 18:16:49 compute-0 nova_compute[183278]: 2026-01-21 18:16:49.318 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:16:49 compute-0 nova_compute[183278]: 2026-01-21 18:16:49.606 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:16:49 compute-0 ovn_controller[95419]: 2026-01-21T18:16:49Z|00064|memory_trim|INFO|Detected inactivity (last active 30006 ms ago): trimming memory
Jan 21 18:16:54 compute-0 nova_compute[183278]: 2026-01-21 18:16:54.319 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:16:54 compute-0 nova_compute[183278]: 2026-01-21 18:16:54.607 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:16:59 compute-0 nova_compute[183278]: 2026-01-21 18:16:59.321 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:16:59 compute-0 nova_compute[183278]: 2026-01-21 18:16:59.652 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:16:59 compute-0 podman[192560]: time="2026-01-21T18:16:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 18:16:59 compute-0 podman[192560]: @ - - [21/Jan/2026:18:16:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16587 "" "Go-http-client/1.1"
Jan 21 18:16:59 compute-0 podman[192560]: @ - - [21/Jan/2026:18:16:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2635 "" "Go-http-client/1.1"
Jan 21 18:17:01 compute-0 openstack_network_exporter[195402]: ERROR   18:17:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 21 18:17:01 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:17:01 compute-0 openstack_network_exporter[195402]: ERROR   18:17:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 21 18:17:01 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:17:04 compute-0 nova_compute[183278]: 2026-01-21 18:17:04.325 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:17:04 compute-0 nova_compute[183278]: 2026-01-21 18:17:04.655 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:17:06 compute-0 nova_compute[183278]: 2026-01-21 18:17:06.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:17:06 compute-0 nova_compute[183278]: 2026-01-21 18:17:06.818 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 18:17:06 compute-0 nova_compute[183278]: 2026-01-21 18:17:06.818 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 18:17:07 compute-0 nova_compute[183278]: 2026-01-21 18:17:07.450 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "refresh_cache-b12214ad-ceca-4678-87c7-b9f991d6959e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:17:07 compute-0 nova_compute[183278]: 2026-01-21 18:17:07.451 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquired lock "refresh_cache-b12214ad-ceca-4678-87c7-b9f991d6959e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 18:17:07 compute-0 nova_compute[183278]: 2026-01-21 18:17:07.451 183284 DEBUG nova.network.neutron [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 21 18:17:07 compute-0 nova_compute[183278]: 2026-01-21 18:17:07.451 183284 DEBUG nova.objects.instance [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lazy-loading 'info_cache' on Instance uuid b12214ad-ceca-4678-87c7-b9f991d6959e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:17:08 compute-0 nova_compute[183278]: 2026-01-21 18:17:08.968 183284 DEBUG nova.network.neutron [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Updating instance_info_cache with network_info: [{"id": "d068a9ac-1495-43a0-9d00-8867b0e13f03", "address": "fa:16:3e:34:8d:04", "network": {"id": "b9000416-b201-46fa-8852-f77e5f1d4714", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1785313754-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aeb82e0980254fc885bc0eaa70c4cc68", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd068a9ac-14", "ovs_interfaceid": "d068a9ac-1495-43a0-9d00-8867b0e13f03", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:17:09 compute-0 podman[205569]: 2026-01-21 18:17:09.004496375 +0000 UTC m=+0.062100723 container health_status 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, name=ubi9-minimal, vendor=Red Hat, Inc., version=9.6, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, io.openshift.expose-services=, managed_by=edpm_ansible, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers)
Jan 21 18:17:09 compute-0 nova_compute[183278]: 2026-01-21 18:17:09.265 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Releasing lock "refresh_cache-b12214ad-ceca-4678-87c7-b9f991d6959e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 18:17:09 compute-0 nova_compute[183278]: 2026-01-21 18:17:09.265 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 21 18:17:09 compute-0 nova_compute[183278]: 2026-01-21 18:17:09.266 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:17:09 compute-0 nova_compute[183278]: 2026-01-21 18:17:09.266 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:17:09 compute-0 nova_compute[183278]: 2026-01-21 18:17:09.266 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:17:09 compute-0 nova_compute[183278]: 2026-01-21 18:17:09.327 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:17:09 compute-0 nova_compute[183278]: 2026-01-21 18:17:09.543 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:17:09 compute-0 nova_compute[183278]: 2026-01-21 18:17:09.543 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:17:09 compute-0 nova_compute[183278]: 2026-01-21 18:17:09.543 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:17:09 compute-0 nova_compute[183278]: 2026-01-21 18:17:09.544 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 18:17:09 compute-0 nova_compute[183278]: 2026-01-21 18:17:09.658 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:17:10 compute-0 nova_compute[183278]: 2026-01-21 18:17:10.055 183284 DEBUG oslo_concurrency.processutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b12214ad-ceca-4678-87c7-b9f991d6959e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:17:10 compute-0 nova_compute[183278]: 2026-01-21 18:17:10.119 183284 DEBUG oslo_concurrency.processutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b12214ad-ceca-4678-87c7-b9f991d6959e/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:17:10 compute-0 nova_compute[183278]: 2026-01-21 18:17:10.120 183284 DEBUG oslo_concurrency.processutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b12214ad-ceca-4678-87c7-b9f991d6959e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:17:10 compute-0 nova_compute[183278]: 2026-01-21 18:17:10.179 183284 DEBUG oslo_concurrency.processutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b12214ad-ceca-4678-87c7-b9f991d6959e/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:17:10 compute-0 nova_compute[183278]: 2026-01-21 18:17:10.333 183284 WARNING nova.virt.libvirt.driver [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 18:17:10 compute-0 nova_compute[183278]: 2026-01-21 18:17:10.334 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5695MB free_disk=73.35465240478516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 18:17:10 compute-0 nova_compute[183278]: 2026-01-21 18:17:10.334 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:17:10 compute-0 nova_compute[183278]: 2026-01-21 18:17:10.334 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:17:10 compute-0 nova_compute[183278]: 2026-01-21 18:17:10.424 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Instance b12214ad-ceca-4678-87c7-b9f991d6959e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 21 18:17:10 compute-0 nova_compute[183278]: 2026-01-21 18:17:10.424 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 18:17:10 compute-0 nova_compute[183278]: 2026-01-21 18:17:10.424 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 18:17:10 compute-0 nova_compute[183278]: 2026-01-21 18:17:10.474 183284 DEBUG nova.compute.provider_tree [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Inventory has not changed in ProviderTree for provider: 502e4243-611b-433d-a766-9b485d51652d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 18:17:10 compute-0 nova_compute[183278]: 2026-01-21 18:17:10.491 183284 DEBUG nova.scheduler.client.report [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Inventory has not changed for provider 502e4243-611b-433d-a766-9b485d51652d based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 18:17:10 compute-0 nova_compute[183278]: 2026-01-21 18:17:10.519 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 18:17:10 compute-0 nova_compute[183278]: 2026-01-21 18:17:10.519 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.185s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:17:11 compute-0 nova_compute[183278]: 2026-01-21 18:17:11.070 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:17:11 compute-0 nova_compute[183278]: 2026-01-21 18:17:11.070 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:17:11 compute-0 nova_compute[183278]: 2026-01-21 18:17:11.070 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:17:11 compute-0 nova_compute[183278]: 2026-01-21 18:17:11.071 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 18:17:12 compute-0 nova_compute[183278]: 2026-01-21 18:17:12.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:17:12 compute-0 nova_compute[183278]: 2026-01-21 18:17:12.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:17:13 compute-0 nova_compute[183278]: 2026-01-21 18:17:13.812 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:17:14 compute-0 nova_compute[183278]: 2026-01-21 18:17:14.329 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:17:14 compute-0 nova_compute[183278]: 2026-01-21 18:17:14.709 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:17:14 compute-0 podman[205603]: 2026-01-21 18:17:14.989412439 +0000 UTC m=+0.044650121 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 21 18:17:15 compute-0 podman[205602]: 2026-01-21 18:17:15.014381884 +0000 UTC m=+0.073263924 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller)
Jan 21 18:17:19 compute-0 nova_compute[183278]: 2026-01-21 18:17:19.331 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:17:19 compute-0 nova_compute[183278]: 2026-01-21 18:17:19.711 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:17:19 compute-0 podman[205645]: 2026-01-21 18:17:19.990494899 +0000 UTC m=+0.047708355 container health_status 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 21 18:17:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:17:20.074 104698 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:17:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:17:20.074 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:17:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:17:20.075 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:17:24 compute-0 nova_compute[183278]: 2026-01-21 18:17:24.333 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:17:24 compute-0 nova_compute[183278]: 2026-01-21 18:17:24.713 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:17:29 compute-0 nova_compute[183278]: 2026-01-21 18:17:29.336 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:17:29 compute-0 nova_compute[183278]: 2026-01-21 18:17:29.715 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:17:29 compute-0 podman[192560]: time="2026-01-21T18:17:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 18:17:29 compute-0 podman[192560]: @ - - [21/Jan/2026:18:17:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16587 "" "Go-http-client/1.1"
Jan 21 18:17:29 compute-0 podman[192560]: @ - - [21/Jan/2026:18:17:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2634 "" "Go-http-client/1.1"
Jan 21 18:17:31 compute-0 openstack_network_exporter[195402]: ERROR   18:17:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 21 18:17:31 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:17:31 compute-0 openstack_network_exporter[195402]: ERROR   18:17:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 21 18:17:31 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:17:34 compute-0 nova_compute[183278]: 2026-01-21 18:17:34.338 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:17:34 compute-0 nova_compute[183278]: 2026-01-21 18:17:34.717 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:17:39 compute-0 nova_compute[183278]: 2026-01-21 18:17:39.340 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:17:39 compute-0 nova_compute[183278]: 2026-01-21 18:17:39.719 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:17:40 compute-0 podman[205669]: 2026-01-21 18:17:40.006553482 +0000 UTC m=+0.060359302 container health_status 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., distribution-scope=public, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., vcs-type=git, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 21 18:17:40 compute-0 nova_compute[183278]: 2026-01-21 18:17:40.966 183284 DEBUG nova.compute.manager [None req-f018bb07-ffbd-4481-a57d-be0331c647a2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Adding trait COMPUTE_STATUS_DISABLED to compute node resource provider 502e4243-611b-433d-a766-9b485d51652d in placement. update_compute_provider_status /usr/lib/python3.9/site-packages/nova/compute/manager.py:610
Jan 21 18:17:41 compute-0 nova_compute[183278]: 2026-01-21 18:17:41.112 183284 DEBUG nova.compute.provider_tree [None req-f018bb07-ffbd-4481-a57d-be0331c647a2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Updating resource provider 502e4243-611b-433d-a766-9b485d51652d generation from 4 to 13 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Jan 21 18:17:44 compute-0 nova_compute[183278]: 2026-01-21 18:17:44.368 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:17:44 compute-0 nova_compute[183278]: 2026-01-21 18:17:44.720 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:17:45 compute-0 podman[205692]: 2026-01-21 18:17:45.996227011 +0000 UTC m=+0.048819832 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 18:17:46 compute-0 podman[205691]: 2026-01-21 18:17:46.023634424 +0000 UTC m=+0.079670098 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 21 18:17:46 compute-0 nova_compute[183278]: 2026-01-21 18:17:46.983 183284 DEBUG nova.virt.libvirt.driver [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Check if temp file /var/lib/nova/instances/tmplz2tbm8l exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Jan 21 18:17:46 compute-0 nova_compute[183278]: 2026-01-21 18:17:46.983 183284 DEBUG nova.compute.manager [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmplz2tbm8l',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='b12214ad-ceca-4678-87c7-b9f991d6959e',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Jan 21 18:17:48 compute-0 nova_compute[183278]: 2026-01-21 18:17:48.011 183284 DEBUG oslo_concurrency.processutils [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b12214ad-ceca-4678-87c7-b9f991d6959e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:17:48 compute-0 nova_compute[183278]: 2026-01-21 18:17:48.069 183284 DEBUG oslo_concurrency.processutils [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b12214ad-ceca-4678-87c7-b9f991d6959e/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:17:48 compute-0 nova_compute[183278]: 2026-01-21 18:17:48.070 183284 DEBUG oslo_concurrency.processutils [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b12214ad-ceca-4678-87c7-b9f991d6959e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:17:48 compute-0 nova_compute[183278]: 2026-01-21 18:17:48.124 183284 DEBUG oslo_concurrency.processutils [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b12214ad-ceca-4678-87c7-b9f991d6959e/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:17:49 compute-0 nova_compute[183278]: 2026-01-21 18:17:49.407 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:17:49 compute-0 nova_compute[183278]: 2026-01-21 18:17:49.723 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:17:50 compute-0 podman[205740]: 2026-01-21 18:17:50.986438458 +0000 UTC m=+0.047296686 container health_status 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 21 18:17:54 compute-0 sshd-session[205764]: Accepted publickey for nova from 192.168.122.101 port 46818 ssh2: ECDSA SHA256:29a5JNhHHz2bb0ACqZTr6qOKeSRnhiTRA8SK+rzn9gs
Jan 21 18:17:54 compute-0 nova_compute[183278]: 2026-01-21 18:17:54.454 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:17:54 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Jan 21 18:17:54 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 21 18:17:54 compute-0 systemd-logind[782]: New session 30 of user nova.
Jan 21 18:17:54 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 21 18:17:54 compute-0 systemd[1]: Starting User Manager for UID 42436...
Jan 21 18:17:54 compute-0 systemd[205768]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 21 18:17:54 compute-0 systemd[205768]: Queued start job for default target Main User Target.
Jan 21 18:17:54 compute-0 systemd[205768]: Created slice User Application Slice.
Jan 21 18:17:54 compute-0 systemd[205768]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 21 18:17:54 compute-0 systemd[205768]: Started Daily Cleanup of User's Temporary Directories.
Jan 21 18:17:54 compute-0 systemd[205768]: Reached target Paths.
Jan 21 18:17:54 compute-0 systemd[205768]: Reached target Timers.
Jan 21 18:17:54 compute-0 systemd[205768]: Starting D-Bus User Message Bus Socket...
Jan 21 18:17:54 compute-0 systemd[205768]: Starting Create User's Volatile Files and Directories...
Jan 21 18:17:54 compute-0 systemd[205768]: Listening on D-Bus User Message Bus Socket.
Jan 21 18:17:54 compute-0 systemd[205768]: Reached target Sockets.
Jan 21 18:17:54 compute-0 systemd[205768]: Finished Create User's Volatile Files and Directories.
Jan 21 18:17:54 compute-0 systemd[205768]: Reached target Basic System.
Jan 21 18:17:54 compute-0 systemd[205768]: Reached target Main User Target.
Jan 21 18:17:54 compute-0 systemd[205768]: Startup finished in 129ms.
Jan 21 18:17:54 compute-0 systemd[1]: Started User Manager for UID 42436.
Jan 21 18:17:54 compute-0 systemd[1]: Started Session 30 of User nova.
Jan 21 18:17:54 compute-0 sshd-session[205764]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 21 18:17:54 compute-0 nova_compute[183278]: 2026-01-21 18:17:54.724 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:17:54 compute-0 sshd-session[205783]: Received disconnect from 192.168.122.101 port 46818:11: disconnected by user
Jan 21 18:17:54 compute-0 sshd-session[205783]: Disconnected from user nova 192.168.122.101 port 46818
Jan 21 18:17:54 compute-0 sshd-session[205764]: pam_unix(sshd:session): session closed for user nova
Jan 21 18:17:54 compute-0 systemd[1]: session-30.scope: Deactivated successfully.
Jan 21 18:17:54 compute-0 systemd-logind[782]: Session 30 logged out. Waiting for processes to exit.
Jan 21 18:17:54 compute-0 systemd-logind[782]: Removed session 30.
Jan 21 18:17:56 compute-0 nova_compute[183278]: 2026-01-21 18:17:56.787 183284 DEBUG nova.compute.manager [req-2e557dd9-e679-46b3-9cd3-03d900351e9e req-c05b533e-1982-4e80-8a8b-93bb09f85e05 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Received event network-vif-unplugged-d068a9ac-1495-43a0-9d00-8867b0e13f03 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:17:56 compute-0 nova_compute[183278]: 2026-01-21 18:17:56.788 183284 DEBUG oslo_concurrency.lockutils [req-2e557dd9-e679-46b3-9cd3-03d900351e9e req-c05b533e-1982-4e80-8a8b-93bb09f85e05 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "b12214ad-ceca-4678-87c7-b9f991d6959e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:17:56 compute-0 nova_compute[183278]: 2026-01-21 18:17:56.789 183284 DEBUG oslo_concurrency.lockutils [req-2e557dd9-e679-46b3-9cd3-03d900351e9e req-c05b533e-1982-4e80-8a8b-93bb09f85e05 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "b12214ad-ceca-4678-87c7-b9f991d6959e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:17:56 compute-0 nova_compute[183278]: 2026-01-21 18:17:56.789 183284 DEBUG oslo_concurrency.lockutils [req-2e557dd9-e679-46b3-9cd3-03d900351e9e req-c05b533e-1982-4e80-8a8b-93bb09f85e05 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "b12214ad-ceca-4678-87c7-b9f991d6959e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:17:56 compute-0 nova_compute[183278]: 2026-01-21 18:17:56.790 183284 DEBUG nova.compute.manager [req-2e557dd9-e679-46b3-9cd3-03d900351e9e req-c05b533e-1982-4e80-8a8b-93bb09f85e05 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] No waiting events found dispatching network-vif-unplugged-d068a9ac-1495-43a0-9d00-8867b0e13f03 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:17:56 compute-0 nova_compute[183278]: 2026-01-21 18:17:56.790 183284 DEBUG nova.compute.manager [req-2e557dd9-e679-46b3-9cd3-03d900351e9e req-c05b533e-1982-4e80-8a8b-93bb09f85e05 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Received event network-vif-unplugged-d068a9ac-1495-43a0-9d00-8867b0e13f03 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 21 18:17:58 compute-0 nova_compute[183278]: 2026-01-21 18:17:58.205 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:17:58 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:17:58.206 104698 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e2:aa:66', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f6:39:87:73:c8'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 18:17:58 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:17:58.207 104698 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 21 18:17:59 compute-0 nova_compute[183278]: 2026-01-21 18:17:59.197 183284 DEBUG nova.compute.manager [req-eabf141b-1906-4da9-8912-9e82a66d21d7 req-bcf00321-196c-4aa3-9202-135594d6a68f 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Received event network-vif-plugged-d068a9ac-1495-43a0-9d00-8867b0e13f03 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:17:59 compute-0 nova_compute[183278]: 2026-01-21 18:17:59.197 183284 DEBUG oslo_concurrency.lockutils [req-eabf141b-1906-4da9-8912-9e82a66d21d7 req-bcf00321-196c-4aa3-9202-135594d6a68f 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "b12214ad-ceca-4678-87c7-b9f991d6959e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:17:59 compute-0 nova_compute[183278]: 2026-01-21 18:17:59.198 183284 DEBUG oslo_concurrency.lockutils [req-eabf141b-1906-4da9-8912-9e82a66d21d7 req-bcf00321-196c-4aa3-9202-135594d6a68f 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "b12214ad-ceca-4678-87c7-b9f991d6959e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:17:59 compute-0 nova_compute[183278]: 2026-01-21 18:17:59.198 183284 DEBUG oslo_concurrency.lockutils [req-eabf141b-1906-4da9-8912-9e82a66d21d7 req-bcf00321-196c-4aa3-9202-135594d6a68f 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "b12214ad-ceca-4678-87c7-b9f991d6959e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:17:59 compute-0 nova_compute[183278]: 2026-01-21 18:17:59.198 183284 DEBUG nova.compute.manager [req-eabf141b-1906-4da9-8912-9e82a66d21d7 req-bcf00321-196c-4aa3-9202-135594d6a68f 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] No waiting events found dispatching network-vif-plugged-d068a9ac-1495-43a0-9d00-8867b0e13f03 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:17:59 compute-0 nova_compute[183278]: 2026-01-21 18:17:59.198 183284 WARNING nova.compute.manager [req-eabf141b-1906-4da9-8912-9e82a66d21d7 req-bcf00321-196c-4aa3-9202-135594d6a68f 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Received unexpected event network-vif-plugged-d068a9ac-1495-43a0-9d00-8867b0e13f03 for instance with vm_state active and task_state migrating.
Jan 21 18:17:59 compute-0 nova_compute[183278]: 2026-01-21 18:17:59.499 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:17:59 compute-0 nova_compute[183278]: 2026-01-21 18:17:59.728 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:17:59 compute-0 podman[192560]: time="2026-01-21T18:17:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 18:17:59 compute-0 podman[192560]: @ - - [21/Jan/2026:18:17:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16587 "" "Go-http-client/1.1"
Jan 21 18:17:59 compute-0 podman[192560]: @ - - [21/Jan/2026:18:17:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2639 "" "Go-http-client/1.1"
Jan 21 18:17:59 compute-0 nova_compute[183278]: 2026-01-21 18:17:59.845 183284 INFO nova.compute.manager [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Took 11.72 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Jan 21 18:17:59 compute-0 nova_compute[183278]: 2026-01-21 18:17:59.845 183284 DEBUG nova.compute.manager [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 18:17:59 compute-0 nova_compute[183278]: 2026-01-21 18:17:59.862 183284 DEBUG nova.compute.manager [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmplz2tbm8l',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='b12214ad-ceca-4678-87c7-b9f991d6959e',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(adf03938-f324-4805-bba7-808a0b06006d),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Jan 21 18:17:59 compute-0 nova_compute[183278]: 2026-01-21 18:17:59.881 183284 DEBUG nova.objects.instance [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lazy-loading 'migration_context' on Instance uuid b12214ad-ceca-4678-87c7-b9f991d6959e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:17:59 compute-0 nova_compute[183278]: 2026-01-21 18:17:59.882 183284 DEBUG nova.virt.libvirt.driver [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Jan 21 18:17:59 compute-0 nova_compute[183278]: 2026-01-21 18:17:59.883 183284 DEBUG nova.virt.libvirt.driver [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Jan 21 18:17:59 compute-0 nova_compute[183278]: 2026-01-21 18:17:59.883 183284 DEBUG nova.virt.libvirt.driver [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Jan 21 18:17:59 compute-0 nova_compute[183278]: 2026-01-21 18:17:59.950 183284 DEBUG nova.virt.libvirt.vif [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T18:16:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-81450804',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-81450804',id=7,image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T18:16:11Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='aeb82e0980254fc885bc0eaa70c4cc68',ramdisk_id='',reservation_id='r-kywgsf09',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0'
,owner_project_name='tempest-TestExecuteBasicStrategy-477892887',owner_user_name='tempest-TestExecuteBasicStrategy-477892887-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T18:16:11Z,user_data=None,user_id='c22533b2094d465a9fc14ed57c562c02',uuid=b12214ad-ceca-4678-87c7-b9f991d6959e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d068a9ac-1495-43a0-9d00-8867b0e13f03", "address": "fa:16:3e:34:8d:04", "network": {"id": "b9000416-b201-46fa-8852-f77e5f1d4714", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1785313754-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aeb82e0980254fc885bc0eaa70c4cc68", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapd068a9ac-14", "ovs_interfaceid": "d068a9ac-1495-43a0-9d00-8867b0e13f03", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 18:17:59 compute-0 nova_compute[183278]: 2026-01-21 18:17:59.950 183284 DEBUG nova.network.os_vif_util [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Converting VIF {"id": "d068a9ac-1495-43a0-9d00-8867b0e13f03", "address": "fa:16:3e:34:8d:04", "network": {"id": "b9000416-b201-46fa-8852-f77e5f1d4714", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1785313754-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aeb82e0980254fc885bc0eaa70c4cc68", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapd068a9ac-14", "ovs_interfaceid": "d068a9ac-1495-43a0-9d00-8867b0e13f03", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 18:17:59 compute-0 nova_compute[183278]: 2026-01-21 18:17:59.951 183284 DEBUG nova.network.os_vif_util [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:34:8d:04,bridge_name='br-int',has_traffic_filtering=True,id=d068a9ac-1495-43a0-9d00-8867b0e13f03,network=Network(b9000416-b201-46fa-8852-f77e5f1d4714),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd068a9ac-14') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 18:17:59 compute-0 nova_compute[183278]: 2026-01-21 18:17:59.952 183284 DEBUG nova.virt.libvirt.migration [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Updating guest XML with vif config: <interface type="ethernet">
Jan 21 18:17:59 compute-0 nova_compute[183278]:   <mac address="fa:16:3e:34:8d:04"/>
Jan 21 18:17:59 compute-0 nova_compute[183278]:   <model type="virtio"/>
Jan 21 18:17:59 compute-0 nova_compute[183278]:   <driver name="vhost" rx_queue_size="512"/>
Jan 21 18:17:59 compute-0 nova_compute[183278]:   <mtu size="1442"/>
Jan 21 18:17:59 compute-0 nova_compute[183278]:   <target dev="tapd068a9ac-14"/>
Jan 21 18:17:59 compute-0 nova_compute[183278]: </interface>
Jan 21 18:17:59 compute-0 nova_compute[183278]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Jan 21 18:17:59 compute-0 nova_compute[183278]: 2026-01-21 18:17:59.952 183284 DEBUG nova.virt.libvirt.driver [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Jan 21 18:18:00 compute-0 nova_compute[183278]: 2026-01-21 18:18:00.386 183284 DEBUG nova.virt.libvirt.migration [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 21 18:18:00 compute-0 nova_compute[183278]: 2026-01-21 18:18:00.386 183284 INFO nova.virt.libvirt.migration [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Increasing downtime to 50 ms after 0 sec elapsed time
Jan 21 18:18:00 compute-0 nova_compute[183278]: 2026-01-21 18:18:00.523 183284 INFO nova.virt.libvirt.driver [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Jan 21 18:18:01 compute-0 nova_compute[183278]: 2026-01-21 18:18:01.026 183284 DEBUG nova.virt.libvirt.migration [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 21 18:18:01 compute-0 nova_compute[183278]: 2026-01-21 18:18:01.026 183284 DEBUG nova.virt.libvirt.migration [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Jan 21 18:18:01 compute-0 openstack_network_exporter[195402]: ERROR   18:18:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 21 18:18:01 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:18:01 compute-0 openstack_network_exporter[195402]: ERROR   18:18:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 21 18:18:01 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:18:01 compute-0 nova_compute[183278]: 2026-01-21 18:18:01.443 183284 DEBUG nova.compute.manager [req-8362a068-ea35-41ed-95ba-7b354ecfd768 req-d4636612-de2d-45cf-b23e-a6956977dd77 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Received event network-changed-d068a9ac-1495-43a0-9d00-8867b0e13f03 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:18:01 compute-0 nova_compute[183278]: 2026-01-21 18:18:01.443 183284 DEBUG nova.compute.manager [req-8362a068-ea35-41ed-95ba-7b354ecfd768 req-d4636612-de2d-45cf-b23e-a6956977dd77 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Refreshing instance network info cache due to event network-changed-d068a9ac-1495-43a0-9d00-8867b0e13f03. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 18:18:01 compute-0 nova_compute[183278]: 2026-01-21 18:18:01.444 183284 DEBUG oslo_concurrency.lockutils [req-8362a068-ea35-41ed-95ba-7b354ecfd768 req-d4636612-de2d-45cf-b23e-a6956977dd77 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "refresh_cache-b12214ad-ceca-4678-87c7-b9f991d6959e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:18:01 compute-0 nova_compute[183278]: 2026-01-21 18:18:01.444 183284 DEBUG oslo_concurrency.lockutils [req-8362a068-ea35-41ed-95ba-7b354ecfd768 req-d4636612-de2d-45cf-b23e-a6956977dd77 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquired lock "refresh_cache-b12214ad-ceca-4678-87c7-b9f991d6959e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 18:18:01 compute-0 nova_compute[183278]: 2026-01-21 18:18:01.445 183284 DEBUG nova.network.neutron [req-8362a068-ea35-41ed-95ba-7b354ecfd768 req-d4636612-de2d-45cf-b23e-a6956977dd77 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Refreshing network info cache for port d068a9ac-1495-43a0-9d00-8867b0e13f03 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 18:18:01 compute-0 nova_compute[183278]: 2026-01-21 18:18:01.529 183284 DEBUG nova.virt.libvirt.migration [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 21 18:18:01 compute-0 nova_compute[183278]: 2026-01-21 18:18:01.529 183284 DEBUG nova.virt.libvirt.migration [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Jan 21 18:18:02 compute-0 nova_compute[183278]: 2026-01-21 18:18:02.033 183284 DEBUG nova.virt.libvirt.migration [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 21 18:18:02 compute-0 nova_compute[183278]: 2026-01-21 18:18:02.034 183284 DEBUG nova.virt.libvirt.migration [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Jan 21 18:18:02 compute-0 nova_compute[183278]: 2026-01-21 18:18:02.538 183284 DEBUG nova.virt.libvirt.migration [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 21 18:18:02 compute-0 nova_compute[183278]: 2026-01-21 18:18:02.538 183284 DEBUG nova.virt.libvirt.migration [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Jan 21 18:18:03 compute-0 nova_compute[183278]: 2026-01-21 18:18:03.045 183284 DEBUG nova.virt.libvirt.migration [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Current 50 elapsed 3 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 21 18:18:03 compute-0 nova_compute[183278]: 2026-01-21 18:18:03.045 183284 DEBUG nova.virt.libvirt.migration [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Jan 21 18:18:03 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:18:03.209 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a4db021d-a451-4e5f-8011-49af760bda68, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:18:03 compute-0 nova_compute[183278]: 2026-01-21 18:18:03.365 183284 DEBUG nova.network.neutron [req-8362a068-ea35-41ed-95ba-7b354ecfd768 req-d4636612-de2d-45cf-b23e-a6956977dd77 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Updated VIF entry in instance network info cache for port d068a9ac-1495-43a0-9d00-8867b0e13f03. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 18:18:03 compute-0 nova_compute[183278]: 2026-01-21 18:18:03.366 183284 DEBUG nova.network.neutron [req-8362a068-ea35-41ed-95ba-7b354ecfd768 req-d4636612-de2d-45cf-b23e-a6956977dd77 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Updating instance_info_cache with network_info: [{"id": "d068a9ac-1495-43a0-9d00-8867b0e13f03", "address": "fa:16:3e:34:8d:04", "network": {"id": "b9000416-b201-46fa-8852-f77e5f1d4714", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1785313754-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aeb82e0980254fc885bc0eaa70c4cc68", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd068a9ac-14", "ovs_interfaceid": "d068a9ac-1495-43a0-9d00-8867b0e13f03", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:18:03 compute-0 nova_compute[183278]: 2026-01-21 18:18:03.504 183284 DEBUG oslo_concurrency.lockutils [req-8362a068-ea35-41ed-95ba-7b354ecfd768 req-d4636612-de2d-45cf-b23e-a6956977dd77 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Releasing lock "refresh_cache-b12214ad-ceca-4678-87c7-b9f991d6959e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 18:18:03 compute-0 nova_compute[183278]: 2026-01-21 18:18:03.548 183284 DEBUG nova.virt.libvirt.migration [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Current 50 elapsed 3 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 21 18:18:03 compute-0 nova_compute[183278]: 2026-01-21 18:18:03.549 183284 DEBUG nova.virt.libvirt.migration [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Jan 21 18:18:04 compute-0 nova_compute[183278]: 2026-01-21 18:18:04.052 183284 DEBUG nova.virt.libvirt.migration [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Current 50 elapsed 4 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 21 18:18:04 compute-0 nova_compute[183278]: 2026-01-21 18:18:04.052 183284 DEBUG nova.virt.libvirt.migration [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Jan 21 18:18:04 compute-0 nova_compute[183278]: 2026-01-21 18:18:04.501 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:18:04 compute-0 nova_compute[183278]: 2026-01-21 18:18:04.555 183284 DEBUG nova.virt.libvirt.migration [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Current 50 elapsed 4 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 21 18:18:04 compute-0 nova_compute[183278]: 2026-01-21 18:18:04.556 183284 DEBUG nova.virt.libvirt.migration [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Jan 21 18:18:04 compute-0 nova_compute[183278]: 2026-01-21 18:18:04.729 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:18:04 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Jan 21 18:18:04 compute-0 systemd[205768]: Activating special unit Exit the Session...
Jan 21 18:18:04 compute-0 systemd[205768]: Stopped target Main User Target.
Jan 21 18:18:04 compute-0 systemd[205768]: Stopped target Basic System.
Jan 21 18:18:04 compute-0 systemd[205768]: Stopped target Paths.
Jan 21 18:18:04 compute-0 systemd[205768]: Stopped target Sockets.
Jan 21 18:18:04 compute-0 systemd[205768]: Stopped target Timers.
Jan 21 18:18:04 compute-0 systemd[205768]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 21 18:18:04 compute-0 systemd[205768]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 21 18:18:04 compute-0 systemd[205768]: Closed D-Bus User Message Bus Socket.
Jan 21 18:18:04 compute-0 systemd[205768]: Stopped Create User's Volatile Files and Directories.
Jan 21 18:18:04 compute-0 systemd[205768]: Removed slice User Application Slice.
Jan 21 18:18:04 compute-0 systemd[205768]: Reached target Shutdown.
Jan 21 18:18:04 compute-0 systemd[205768]: Finished Exit the Session.
Jan 21 18:18:04 compute-0 systemd[205768]: Reached target Exit the Session.
Jan 21 18:18:04 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Jan 21 18:18:04 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Jan 21 18:18:04 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 21 18:18:04 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 21 18:18:04 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 21 18:18:04 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 21 18:18:04 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Jan 21 18:18:05 compute-0 nova_compute[183278]: 2026-01-21 18:18:05.061 183284 DEBUG nova.virt.libvirt.migration [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Current 50 elapsed 5 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 21 18:18:05 compute-0 nova_compute[183278]: 2026-01-21 18:18:05.062 183284 DEBUG nova.virt.libvirt.migration [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Jan 21 18:18:05 compute-0 sshd-session[205806]: Invalid user ansible_user from 64.227.98.100 port 50558
Jan 21 18:18:05 compute-0 nova_compute[183278]: 2026-01-21 18:18:05.274 183284 DEBUG nova.virt.driver [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Emitting event <LifecycleEvent: 1769019485.273796, b12214ad-ceca-4678-87c7-b9f991d6959e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:18:05 compute-0 nova_compute[183278]: 2026-01-21 18:18:05.274 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] VM Paused (Lifecycle Event)
Jan 21 18:18:05 compute-0 sshd-session[205806]: Connection closed by invalid user ansible_user 64.227.98.100 port 50558 [preauth]
Jan 21 18:18:05 compute-0 nova_compute[183278]: 2026-01-21 18:18:05.365 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:18:05 compute-0 nova_compute[183278]: 2026-01-21 18:18:05.372 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 18:18:05 compute-0 nova_compute[183278]: 2026-01-21 18:18:05.418 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] During sync_power_state the instance has a pending task (migrating). Skip.
Jan 21 18:18:05 compute-0 nova_compute[183278]: 2026-01-21 18:18:05.565 183284 DEBUG nova.virt.libvirt.migration [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Current 50 elapsed 5 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 21 18:18:05 compute-0 nova_compute[183278]: 2026-01-21 18:18:05.566 183284 DEBUG nova.virt.libvirt.migration [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Jan 21 18:18:05 compute-0 nova_compute[183278]: 2026-01-21 18:18:05.872 183284 DEBUG nova.virt.libvirt.driver [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Migration running for 5 secs, memory 0% remaining (bytes processed=111660720, remaining=0, total=143466496); disk 0% remaining (bytes processed=75431936, remaining=0, total=75431936). _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10531
Jan 21 18:18:05 compute-0 kernel: tapd068a9ac-14 (unregistering): left promiscuous mode
Jan 21 18:18:05 compute-0 NetworkManager[55506]: <info>  [1769019485.9760] device (tapd068a9ac-14): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 18:18:05 compute-0 ovn_controller[95419]: 2026-01-21T18:18:05Z|00065|binding|INFO|Releasing lport d068a9ac-1495-43a0-9d00-8867b0e13f03 from this chassis (sb_readonly=0)
Jan 21 18:18:05 compute-0 ovn_controller[95419]: 2026-01-21T18:18:05Z|00066|binding|INFO|Setting lport d068a9ac-1495-43a0-9d00-8867b0e13f03 down in Southbound
Jan 21 18:18:05 compute-0 ovn_controller[95419]: 2026-01-21T18:18:05Z|00067|binding|INFO|Removing iface tapd068a9ac-14 ovn-installed in OVS
Jan 21 18:18:05 compute-0 nova_compute[183278]: 2026-01-21 18:18:05.985 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:18:05 compute-0 nova_compute[183278]: 2026-01-21 18:18:05.987 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:18:06 compute-0 nova_compute[183278]: 2026-01-21 18:18:06.001 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:18:06 compute-0 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000007.scope: Deactivated successfully.
Jan 21 18:18:06 compute-0 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000007.scope: Consumed 17.903s CPU time.
Jan 21 18:18:06 compute-0 systemd-machined[154592]: Machine qemu-5-instance-00000007 terminated.
Jan 21 18:18:06 compute-0 ovn_controller[95419]: 2026-01-21T18:18:06Z|00068|binding|INFO|Releasing lport ca79930b-5c55-4671-8350-d51db8c90360 from this chassis (sb_readonly=0)
Jan 21 18:18:06 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:18:06.072 104698 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:34:8d:04 10.100.0.12'], port_security=['fa:16:3e:34:8d:04 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '88a62794-b4a4-47e3-9cce-91e574e684c1'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'b12214ad-ceca-4678-87c7-b9f991d6959e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b9000416-b201-46fa-8852-f77e5f1d4714', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'aeb82e0980254fc885bc0eaa70c4cc68', 'neutron:revision_number': '8', 'neutron:security_group_ids': '823f317c-7d21-4ee3-849b-1072420150d0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6a49360a-b626-4349-8811-d78c04e8937e, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>], logical_port=d068a9ac-1495-43a0-9d00-8867b0e13f03) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 18:18:06 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:18:06.073 104698 INFO neutron.agent.ovn.metadata.agent [-] Port d068a9ac-1495-43a0-9d00-8867b0e13f03 in datapath b9000416-b201-46fa-8852-f77e5f1d4714 unbound from our chassis
Jan 21 18:18:06 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:18:06.074 104698 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b9000416-b201-46fa-8852-f77e5f1d4714, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 21 18:18:06 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:18:06.076 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[98393eb6-b070-4225-9387-d7c61398a6ae]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:18:06 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:18:06.076 104698 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b9000416-b201-46fa-8852-f77e5f1d4714 namespace which is not needed anymore
Jan 21 18:18:06 compute-0 nova_compute[183278]: 2026-01-21 18:18:06.113 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:18:06 compute-0 neutron-haproxy-ovnmeta-b9000416-b201-46fa-8852-f77e5f1d4714[205373]: [NOTICE]   (205377) : haproxy version is 2.8.14-c23fe91
Jan 21 18:18:06 compute-0 neutron-haproxy-ovnmeta-b9000416-b201-46fa-8852-f77e5f1d4714[205373]: [NOTICE]   (205377) : path to executable is /usr/sbin/haproxy
Jan 21 18:18:06 compute-0 neutron-haproxy-ovnmeta-b9000416-b201-46fa-8852-f77e5f1d4714[205373]: [WARNING]  (205377) : Exiting Master process...
Jan 21 18:18:06 compute-0 neutron-haproxy-ovnmeta-b9000416-b201-46fa-8852-f77e5f1d4714[205373]: [ALERT]    (205377) : Current worker (205379) exited with code 143 (Terminated)
Jan 21 18:18:06 compute-0 neutron-haproxy-ovnmeta-b9000416-b201-46fa-8852-f77e5f1d4714[205373]: [WARNING]  (205377) : All workers exited. Exiting... (0)
Jan 21 18:18:06 compute-0 systemd[1]: libpod-d104471c7c62f1d0fd3d2f07d368eaafca942cb97befdd155d3ea9625a5fc1a9.scope: Deactivated successfully.
Jan 21 18:18:06 compute-0 podman[205833]: 2026-01-21 18:18:06.212356414 +0000 UTC m=+0.047638655 container died d104471c7c62f1d0fd3d2f07d368eaafca942cb97befdd155d3ea9625a5fc1a9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b9000416-b201-46fa-8852-f77e5f1d4714, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 21 18:18:06 compute-0 nova_compute[183278]: 2026-01-21 18:18:06.212 183284 DEBUG nova.virt.libvirt.driver [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Jan 21 18:18:06 compute-0 nova_compute[183278]: 2026-01-21 18:18:06.213 183284 DEBUG nova.virt.libvirt.driver [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Jan 21 18:18:06 compute-0 nova_compute[183278]: 2026-01-21 18:18:06.213 183284 DEBUG nova.virt.libvirt.driver [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Jan 21 18:18:06 compute-0 systemd[1]: var-lib-containers-storage-overlay-83f23d12cef94d6e22833c0cb8e91c0e550bbf9140b2df35e0707f1d315c39e7-merged.mount: Deactivated successfully.
Jan 21 18:18:06 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d104471c7c62f1d0fd3d2f07d368eaafca942cb97befdd155d3ea9625a5fc1a9-userdata-shm.mount: Deactivated successfully.
Jan 21 18:18:06 compute-0 podman[205833]: 2026-01-21 18:18:06.249643205 +0000 UTC m=+0.084925436 container cleanup d104471c7c62f1d0fd3d2f07d368eaafca942cb97befdd155d3ea9625a5fc1a9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b9000416-b201-46fa-8852-f77e5f1d4714, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 21 18:18:06 compute-0 systemd[1]: libpod-conmon-d104471c7c62f1d0fd3d2f07d368eaafca942cb97befdd155d3ea9625a5fc1a9.scope: Deactivated successfully.
Jan 21 18:18:06 compute-0 podman[205880]: 2026-01-21 18:18:06.30438692 +0000 UTC m=+0.036396092 container remove d104471c7c62f1d0fd3d2f07d368eaafca942cb97befdd155d3ea9625a5fc1a9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b9000416-b201-46fa-8852-f77e5f1d4714, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 21 18:18:06 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:18:06.308 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[e4919492-6e17-4fb9-9fab-7a1f13028e5a]: (4, ('Wed Jan 21 06:18:06 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-b9000416-b201-46fa-8852-f77e5f1d4714 (d104471c7c62f1d0fd3d2f07d368eaafca942cb97befdd155d3ea9625a5fc1a9)\nd104471c7c62f1d0fd3d2f07d368eaafca942cb97befdd155d3ea9625a5fc1a9\nWed Jan 21 06:18:06 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-b9000416-b201-46fa-8852-f77e5f1d4714 (d104471c7c62f1d0fd3d2f07d368eaafca942cb97befdd155d3ea9625a5fc1a9)\nd104471c7c62f1d0fd3d2f07d368eaafca942cb97befdd155d3ea9625a5fc1a9\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:18:06 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:18:06.310 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[05afc2e8-890f-44d8-9849-7c156d34309f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:18:06 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:18:06.311 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb9000416-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:18:06 compute-0 nova_compute[183278]: 2026-01-21 18:18:06.313 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:18:06 compute-0 kernel: tapb9000416-b0: left promiscuous mode
Jan 21 18:18:06 compute-0 nova_compute[183278]: 2026-01-21 18:18:06.329 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:18:06 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:18:06.332 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[e5af854d-f90e-43dc-8d92-0651c35811c6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:18:06 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:18:06.352 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[4b7ddab9-5dab-4676-8503-f4e372011575]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:18:06 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:18:06.353 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[ad9b1d73-bc34-4ee2-b828-7552d5f14a7b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:18:06 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:18:06.371 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[81c9972d-474e-4a64-bf35-0a98cb5007ad]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 393449, 'reachable_time': 27198, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 205897, 'error': None, 'target': 'ovnmeta-b9000416-b201-46fa-8852-f77e5f1d4714', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:18:06 compute-0 systemd[1]: run-netns-ovnmeta\x2db9000416\x2db201\x2d46fa\x2d8852\x2df77e5f1d4714.mount: Deactivated successfully.
Jan 21 18:18:06 compute-0 nova_compute[183278]: 2026-01-21 18:18:06.376 183284 DEBUG nova.virt.libvirt.guest [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid 'b12214ad-ceca-4678-87c7-b9f991d6959e' (instance-00000007) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Jan 21 18:18:06 compute-0 nova_compute[183278]: 2026-01-21 18:18:06.377 183284 INFO nova.virt.libvirt.driver [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Migration operation has completed
Jan 21 18:18:06 compute-0 nova_compute[183278]: 2026-01-21 18:18:06.377 183284 INFO nova.compute.manager [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] _post_live_migration() is started..
Jan 21 18:18:06 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:18:06.376 105036 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b9000416-b201-46fa-8852-f77e5f1d4714 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 21 18:18:06 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:18:06.377 105036 DEBUG oslo.privsep.daemon [-] privsep: reply[b0b46b0d-0866-4739-9dbc-831dc654fe08]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:18:07 compute-0 nova_compute[183278]: 2026-01-21 18:18:07.394 183284 DEBUG nova.compute.manager [req-a4403ce5-60e1-4f5c-8bc9-676af211644a req-3e6fea27-eb60-4ead-b352-c4aca1474e59 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Received event network-vif-unplugged-d068a9ac-1495-43a0-9d00-8867b0e13f03 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:18:07 compute-0 nova_compute[183278]: 2026-01-21 18:18:07.394 183284 DEBUG oslo_concurrency.lockutils [req-a4403ce5-60e1-4f5c-8bc9-676af211644a req-3e6fea27-eb60-4ead-b352-c4aca1474e59 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "b12214ad-ceca-4678-87c7-b9f991d6959e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:18:07 compute-0 nova_compute[183278]: 2026-01-21 18:18:07.395 183284 DEBUG oslo_concurrency.lockutils [req-a4403ce5-60e1-4f5c-8bc9-676af211644a req-3e6fea27-eb60-4ead-b352-c4aca1474e59 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "b12214ad-ceca-4678-87c7-b9f991d6959e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:18:07 compute-0 nova_compute[183278]: 2026-01-21 18:18:07.395 183284 DEBUG oslo_concurrency.lockutils [req-a4403ce5-60e1-4f5c-8bc9-676af211644a req-3e6fea27-eb60-4ead-b352-c4aca1474e59 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "b12214ad-ceca-4678-87c7-b9f991d6959e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:18:07 compute-0 nova_compute[183278]: 2026-01-21 18:18:07.395 183284 DEBUG nova.compute.manager [req-a4403ce5-60e1-4f5c-8bc9-676af211644a req-3e6fea27-eb60-4ead-b352-c4aca1474e59 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] No waiting events found dispatching network-vif-unplugged-d068a9ac-1495-43a0-9d00-8867b0e13f03 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:18:07 compute-0 nova_compute[183278]: 2026-01-21 18:18:07.395 183284 DEBUG nova.compute.manager [req-a4403ce5-60e1-4f5c-8bc9-676af211644a req-3e6fea27-eb60-4ead-b352-c4aca1474e59 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Received event network-vif-unplugged-d068a9ac-1495-43a0-9d00-8867b0e13f03 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 21 18:18:07 compute-0 nova_compute[183278]: 2026-01-21 18:18:07.700 183284 DEBUG nova.network.neutron [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Activated binding for port d068a9ac-1495-43a0-9d00-8867b0e13f03 and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Jan 21 18:18:07 compute-0 nova_compute[183278]: 2026-01-21 18:18:07.700 183284 DEBUG nova.compute.manager [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "d068a9ac-1495-43a0-9d00-8867b0e13f03", "address": "fa:16:3e:34:8d:04", "network": {"id": "b9000416-b201-46fa-8852-f77e5f1d4714", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1785313754-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aeb82e0980254fc885bc0eaa70c4cc68", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd068a9ac-14", "ovs_interfaceid": "d068a9ac-1495-43a0-9d00-8867b0e13f03", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Jan 21 18:18:07 compute-0 nova_compute[183278]: 2026-01-21 18:18:07.701 183284 DEBUG nova.virt.libvirt.vif [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T18:16:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-81450804',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-81450804',id=7,image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T18:16:11Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=100,project_id='aeb82e0980254fc885bc0eaa70c4cc68',ramdisk_id='',reservation_id='r-kywgsf09',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='
0',owner_project_name='tempest-TestExecuteBasicStrategy-477892887',owner_user_name='tempest-TestExecuteBasicStrategy-477892887-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T18:18:00Z,user_data=None,user_id='c22533b2094d465a9fc14ed57c562c02',uuid=b12214ad-ceca-4678-87c7-b9f991d6959e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d068a9ac-1495-43a0-9d00-8867b0e13f03", "address": "fa:16:3e:34:8d:04", "network": {"id": "b9000416-b201-46fa-8852-f77e5f1d4714", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1785313754-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aeb82e0980254fc885bc0eaa70c4cc68", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd068a9ac-14", "ovs_interfaceid": "d068a9ac-1495-43a0-9d00-8867b0e13f03", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 18:18:07 compute-0 nova_compute[183278]: 2026-01-21 18:18:07.701 183284 DEBUG nova.network.os_vif_util [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Converting VIF {"id": "d068a9ac-1495-43a0-9d00-8867b0e13f03", "address": "fa:16:3e:34:8d:04", "network": {"id": "b9000416-b201-46fa-8852-f77e5f1d4714", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1785313754-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aeb82e0980254fc885bc0eaa70c4cc68", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd068a9ac-14", "ovs_interfaceid": "d068a9ac-1495-43a0-9d00-8867b0e13f03", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 18:18:07 compute-0 nova_compute[183278]: 2026-01-21 18:18:07.702 183284 DEBUG nova.network.os_vif_util [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:34:8d:04,bridge_name='br-int',has_traffic_filtering=True,id=d068a9ac-1495-43a0-9d00-8867b0e13f03,network=Network(b9000416-b201-46fa-8852-f77e5f1d4714),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd068a9ac-14') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 18:18:07 compute-0 nova_compute[183278]: 2026-01-21 18:18:07.702 183284 DEBUG os_vif [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:34:8d:04,bridge_name='br-int',has_traffic_filtering=True,id=d068a9ac-1495-43a0-9d00-8867b0e13f03,network=Network(b9000416-b201-46fa-8852-f77e5f1d4714),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd068a9ac-14') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 21 18:18:07 compute-0 nova_compute[183278]: 2026-01-21 18:18:07.703 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:18:07 compute-0 nova_compute[183278]: 2026-01-21 18:18:07.704 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd068a9ac-14, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:18:07 compute-0 nova_compute[183278]: 2026-01-21 18:18:07.705 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:18:07 compute-0 nova_compute[183278]: 2026-01-21 18:18:07.706 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:18:07 compute-0 nova_compute[183278]: 2026-01-21 18:18:07.708 183284 INFO os_vif [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:34:8d:04,bridge_name='br-int',has_traffic_filtering=True,id=d068a9ac-1495-43a0-9d00-8867b0e13f03,network=Network(b9000416-b201-46fa-8852-f77e5f1d4714),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd068a9ac-14')
Jan 21 18:18:07 compute-0 nova_compute[183278]: 2026-01-21 18:18:07.708 183284 DEBUG oslo_concurrency.lockutils [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:18:07 compute-0 nova_compute[183278]: 2026-01-21 18:18:07.708 183284 DEBUG oslo_concurrency.lockutils [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:18:07 compute-0 nova_compute[183278]: 2026-01-21 18:18:07.709 183284 DEBUG oslo_concurrency.lockutils [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:18:07 compute-0 nova_compute[183278]: 2026-01-21 18:18:07.709 183284 DEBUG nova.compute.manager [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Jan 21 18:18:07 compute-0 nova_compute[183278]: 2026-01-21 18:18:07.709 183284 INFO nova.virt.libvirt.driver [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Deleting instance files /var/lib/nova/instances/b12214ad-ceca-4678-87c7-b9f991d6959e_del
Jan 21 18:18:07 compute-0 nova_compute[183278]: 2026-01-21 18:18:07.710 183284 INFO nova.virt.libvirt.driver [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Deletion of /var/lib/nova/instances/b12214ad-ceca-4678-87c7-b9f991d6959e_del complete
Jan 21 18:18:07 compute-0 nova_compute[183278]: 2026-01-21 18:18:07.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:18:07 compute-0 nova_compute[183278]: 2026-01-21 18:18:07.817 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 18:18:07 compute-0 nova_compute[183278]: 2026-01-21 18:18:07.817 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 18:18:08 compute-0 nova_compute[183278]: 2026-01-21 18:18:08.111 183284 DEBUG nova.compute.manager [req-46c45ed7-ed1e-4d34-add0-a848365401e1 req-05aff1ee-9254-451f-9a0a-fef84793846e 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Received event network-vif-plugged-d068a9ac-1495-43a0-9d00-8867b0e13f03 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:18:08 compute-0 nova_compute[183278]: 2026-01-21 18:18:08.111 183284 DEBUG oslo_concurrency.lockutils [req-46c45ed7-ed1e-4d34-add0-a848365401e1 req-05aff1ee-9254-451f-9a0a-fef84793846e 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "b12214ad-ceca-4678-87c7-b9f991d6959e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:18:08 compute-0 nova_compute[183278]: 2026-01-21 18:18:08.112 183284 DEBUG oslo_concurrency.lockutils [req-46c45ed7-ed1e-4d34-add0-a848365401e1 req-05aff1ee-9254-451f-9a0a-fef84793846e 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "b12214ad-ceca-4678-87c7-b9f991d6959e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:18:08 compute-0 nova_compute[183278]: 2026-01-21 18:18:08.112 183284 DEBUG oslo_concurrency.lockutils [req-46c45ed7-ed1e-4d34-add0-a848365401e1 req-05aff1ee-9254-451f-9a0a-fef84793846e 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "b12214ad-ceca-4678-87c7-b9f991d6959e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:18:08 compute-0 nova_compute[183278]: 2026-01-21 18:18:08.112 183284 DEBUG nova.compute.manager [req-46c45ed7-ed1e-4d34-add0-a848365401e1 req-05aff1ee-9254-451f-9a0a-fef84793846e 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] No waiting events found dispatching network-vif-plugged-d068a9ac-1495-43a0-9d00-8867b0e13f03 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:18:08 compute-0 nova_compute[183278]: 2026-01-21 18:18:08.112 183284 WARNING nova.compute.manager [req-46c45ed7-ed1e-4d34-add0-a848365401e1 req-05aff1ee-9254-451f-9a0a-fef84793846e 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Received unexpected event network-vif-plugged-d068a9ac-1495-43a0-9d00-8867b0e13f03 for instance with vm_state active and task_state migrating.
Jan 21 18:18:08 compute-0 nova_compute[183278]: 2026-01-21 18:18:08.131 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "refresh_cache-b12214ad-ceca-4678-87c7-b9f991d6959e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:18:08 compute-0 nova_compute[183278]: 2026-01-21 18:18:08.131 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquired lock "refresh_cache-b12214ad-ceca-4678-87c7-b9f991d6959e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 18:18:08 compute-0 nova_compute[183278]: 2026-01-21 18:18:08.131 183284 DEBUG nova.network.neutron [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 21 18:18:08 compute-0 nova_compute[183278]: 2026-01-21 18:18:08.131 183284 DEBUG nova.objects.instance [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lazy-loading 'info_cache' on Instance uuid b12214ad-ceca-4678-87c7-b9f991d6959e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:18:09 compute-0 nova_compute[183278]: 2026-01-21 18:18:09.504 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:18:09 compute-0 nova_compute[183278]: 2026-01-21 18:18:09.711 183284 DEBUG nova.network.neutron [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Updating instance_info_cache with network_info: [{"id": "d068a9ac-1495-43a0-9d00-8867b0e13f03", "address": "fa:16:3e:34:8d:04", "network": {"id": "b9000416-b201-46fa-8852-f77e5f1d4714", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1785313754-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "aeb82e0980254fc885bc0eaa70c4cc68", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd068a9ac-14", "ovs_interfaceid": "d068a9ac-1495-43a0-9d00-8867b0e13f03", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:18:10 compute-0 nova_compute[183278]: 2026-01-21 18:18:10.203 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Releasing lock "refresh_cache-b12214ad-ceca-4678-87c7-b9f991d6959e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 18:18:10 compute-0 nova_compute[183278]: 2026-01-21 18:18:10.204 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 21 18:18:10 compute-0 nova_compute[183278]: 2026-01-21 18:18:10.204 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:18:10 compute-0 nova_compute[183278]: 2026-01-21 18:18:10.205 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:18:10 compute-0 nova_compute[183278]: 2026-01-21 18:18:10.205 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:18:10 compute-0 nova_compute[183278]: 2026-01-21 18:18:10.310 183284 DEBUG nova.compute.manager [req-22f8c036-e653-4001-b9e9-05c09b33a706 req-b1554ccf-94a6-416a-92bc-7db746268757 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Received event network-vif-plugged-d068a9ac-1495-43a0-9d00-8867b0e13f03 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:18:10 compute-0 nova_compute[183278]: 2026-01-21 18:18:10.311 183284 DEBUG oslo_concurrency.lockutils [req-22f8c036-e653-4001-b9e9-05c09b33a706 req-b1554ccf-94a6-416a-92bc-7db746268757 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "b12214ad-ceca-4678-87c7-b9f991d6959e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:18:10 compute-0 nova_compute[183278]: 2026-01-21 18:18:10.311 183284 DEBUG oslo_concurrency.lockutils [req-22f8c036-e653-4001-b9e9-05c09b33a706 req-b1554ccf-94a6-416a-92bc-7db746268757 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "b12214ad-ceca-4678-87c7-b9f991d6959e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:18:10 compute-0 nova_compute[183278]: 2026-01-21 18:18:10.311 183284 DEBUG oslo_concurrency.lockutils [req-22f8c036-e653-4001-b9e9-05c09b33a706 req-b1554ccf-94a6-416a-92bc-7db746268757 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "b12214ad-ceca-4678-87c7-b9f991d6959e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:18:10 compute-0 nova_compute[183278]: 2026-01-21 18:18:10.311 183284 DEBUG nova.compute.manager [req-22f8c036-e653-4001-b9e9-05c09b33a706 req-b1554ccf-94a6-416a-92bc-7db746268757 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] No waiting events found dispatching network-vif-plugged-d068a9ac-1495-43a0-9d00-8867b0e13f03 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:18:10 compute-0 nova_compute[183278]: 2026-01-21 18:18:10.312 183284 WARNING nova.compute.manager [req-22f8c036-e653-4001-b9e9-05c09b33a706 req-b1554ccf-94a6-416a-92bc-7db746268757 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Received unexpected event network-vif-plugged-d068a9ac-1495-43a0-9d00-8867b0e13f03 for instance with vm_state active and task_state migrating.
Jan 21 18:18:10 compute-0 nova_compute[183278]: 2026-01-21 18:18:10.312 183284 DEBUG nova.compute.manager [req-22f8c036-e653-4001-b9e9-05c09b33a706 req-b1554ccf-94a6-416a-92bc-7db746268757 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Received event network-vif-plugged-d068a9ac-1495-43a0-9d00-8867b0e13f03 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:18:10 compute-0 nova_compute[183278]: 2026-01-21 18:18:10.312 183284 DEBUG oslo_concurrency.lockutils [req-22f8c036-e653-4001-b9e9-05c09b33a706 req-b1554ccf-94a6-416a-92bc-7db746268757 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "b12214ad-ceca-4678-87c7-b9f991d6959e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:18:10 compute-0 nova_compute[183278]: 2026-01-21 18:18:10.312 183284 DEBUG oslo_concurrency.lockutils [req-22f8c036-e653-4001-b9e9-05c09b33a706 req-b1554ccf-94a6-416a-92bc-7db746268757 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "b12214ad-ceca-4678-87c7-b9f991d6959e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:18:10 compute-0 nova_compute[183278]: 2026-01-21 18:18:10.313 183284 DEBUG oslo_concurrency.lockutils [req-22f8c036-e653-4001-b9e9-05c09b33a706 req-b1554ccf-94a6-416a-92bc-7db746268757 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "b12214ad-ceca-4678-87c7-b9f991d6959e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:18:10 compute-0 nova_compute[183278]: 2026-01-21 18:18:10.313 183284 DEBUG nova.compute.manager [req-22f8c036-e653-4001-b9e9-05c09b33a706 req-b1554ccf-94a6-416a-92bc-7db746268757 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] No waiting events found dispatching network-vif-plugged-d068a9ac-1495-43a0-9d00-8867b0e13f03 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:18:10 compute-0 nova_compute[183278]: 2026-01-21 18:18:10.313 183284 WARNING nova.compute.manager [req-22f8c036-e653-4001-b9e9-05c09b33a706 req-b1554ccf-94a6-416a-92bc-7db746268757 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Received unexpected event network-vif-plugged-d068a9ac-1495-43a0-9d00-8867b0e13f03 for instance with vm_state active and task_state migrating.
Jan 21 18:18:10 compute-0 nova_compute[183278]: 2026-01-21 18:18:10.321 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:18:10 compute-0 nova_compute[183278]: 2026-01-21 18:18:10.322 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:18:10 compute-0 nova_compute[183278]: 2026-01-21 18:18:10.322 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:18:10 compute-0 nova_compute[183278]: 2026-01-21 18:18:10.323 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 18:18:10 compute-0 podman[205900]: 2026-01-21 18:18:10.436853183 +0000 UTC m=+0.069073654 container health_status 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, name=ubi9-minimal, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, release=1755695350, architecture=x86_64, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 21 18:18:10 compute-0 nova_compute[183278]: 2026-01-21 18:18:10.487 183284 WARNING nova.virt.libvirt.driver [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 18:18:10 compute-0 nova_compute[183278]: 2026-01-21 18:18:10.488 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5885MB free_disk=73.38327026367188GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 18:18:10 compute-0 nova_compute[183278]: 2026-01-21 18:18:10.488 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:18:10 compute-0 nova_compute[183278]: 2026-01-21 18:18:10.488 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:18:10 compute-0 nova_compute[183278]: 2026-01-21 18:18:10.629 183284 INFO nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Updating resource usage from migration adf03938-f324-4805-bba7-808a0b06006d
Jan 21 18:18:10 compute-0 nova_compute[183278]: 2026-01-21 18:18:10.666 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Migration adf03938-f324-4805-bba7-808a0b06006d is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Jan 21 18:18:10 compute-0 nova_compute[183278]: 2026-01-21 18:18:10.666 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 18:18:10 compute-0 nova_compute[183278]: 2026-01-21 18:18:10.667 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 18:18:10 compute-0 nova_compute[183278]: 2026-01-21 18:18:10.681 183284 DEBUG nova.scheduler.client.report [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Refreshing inventories for resource provider 502e4243-611b-433d-a766-9b485d51652d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 21 18:18:10 compute-0 nova_compute[183278]: 2026-01-21 18:18:10.696 183284 DEBUG nova.scheduler.client.report [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Updating ProviderTree inventory for provider 502e4243-611b-433d-a766-9b485d51652d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 21 18:18:10 compute-0 nova_compute[183278]: 2026-01-21 18:18:10.696 183284 DEBUG nova.compute.provider_tree [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Updating inventory in ProviderTree for provider 502e4243-611b-433d-a766-9b485d51652d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 21 18:18:10 compute-0 nova_compute[183278]: 2026-01-21 18:18:10.713 183284 DEBUG nova.scheduler.client.report [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Refreshing aggregate associations for resource provider 502e4243-611b-433d-a766-9b485d51652d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 21 18:18:10 compute-0 nova_compute[183278]: 2026-01-21 18:18:10.733 183284 DEBUG nova.scheduler.client.report [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Refreshing trait associations for resource provider 502e4243-611b-433d-a766-9b485d51652d, traits: COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STATUS_DISABLED,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_RESCUE_BFV,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_MMX,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NODE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_IDE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 21 18:18:10 compute-0 nova_compute[183278]: 2026-01-21 18:18:10.780 183284 DEBUG nova.compute.provider_tree [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Inventory has not changed in ProviderTree for provider: 502e4243-611b-433d-a766-9b485d51652d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 18:18:11 compute-0 nova_compute[183278]: 2026-01-21 18:18:11.143 183284 DEBUG nova.scheduler.client.report [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Inventory has not changed for provider 502e4243-611b-433d-a766-9b485d51652d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 18:18:11 compute-0 nova_compute[183278]: 2026-01-21 18:18:11.686 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 18:18:11 compute-0 nova_compute[183278]: 2026-01-21 18:18:11.687 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.198s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:18:12 compute-0 nova_compute[183278]: 2026-01-21 18:18:12.298 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:18:12 compute-0 nova_compute[183278]: 2026-01-21 18:18:12.298 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:18:12 compute-0 nova_compute[183278]: 2026-01-21 18:18:12.299 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:18:12 compute-0 nova_compute[183278]: 2026-01-21 18:18:12.299 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 18:18:12 compute-0 nova_compute[183278]: 2026-01-21 18:18:12.801 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:18:13 compute-0 nova_compute[183278]: 2026-01-21 18:18:13.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:18:14 compute-0 nova_compute[183278]: 2026-01-21 18:18:14.505 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:18:14 compute-0 nova_compute[183278]: 2026-01-21 18:18:14.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:18:15 compute-0 nova_compute[183278]: 2026-01-21 18:18:15.765 183284 DEBUG oslo_concurrency.lockutils [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "b12214ad-ceca-4678-87c7-b9f991d6959e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:18:15 compute-0 nova_compute[183278]: 2026-01-21 18:18:15.765 183284 DEBUG oslo_concurrency.lockutils [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lock "b12214ad-ceca-4678-87c7-b9f991d6959e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:18:15 compute-0 nova_compute[183278]: 2026-01-21 18:18:15.765 183284 DEBUG oslo_concurrency.lockutils [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lock "b12214ad-ceca-4678-87c7-b9f991d6959e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:18:15 compute-0 nova_compute[183278]: 2026-01-21 18:18:15.789 183284 DEBUG oslo_concurrency.lockutils [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:18:15 compute-0 nova_compute[183278]: 2026-01-21 18:18:15.790 183284 DEBUG oslo_concurrency.lockutils [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:18:15 compute-0 nova_compute[183278]: 2026-01-21 18:18:15.790 183284 DEBUG oslo_concurrency.lockutils [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:18:15 compute-0 nova_compute[183278]: 2026-01-21 18:18:15.790 183284 DEBUG nova.compute.resource_tracker [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 18:18:15 compute-0 nova_compute[183278]: 2026-01-21 18:18:15.930 183284 WARNING nova.virt.libvirt.driver [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 18:18:15 compute-0 nova_compute[183278]: 2026-01-21 18:18:15.931 183284 DEBUG nova.compute.resource_tracker [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5885MB free_disk=73.38327026367188GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 18:18:15 compute-0 nova_compute[183278]: 2026-01-21 18:18:15.931 183284 DEBUG oslo_concurrency.lockutils [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:18:15 compute-0 nova_compute[183278]: 2026-01-21 18:18:15.932 183284 DEBUG oslo_concurrency.lockutils [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:18:16 compute-0 nova_compute[183278]: 2026-01-21 18:18:16.704 183284 DEBUG nova.compute.resource_tracker [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Migration for instance b12214ad-ceca-4678-87c7-b9f991d6959e refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Jan 21 18:18:16 compute-0 nova_compute[183278]: 2026-01-21 18:18:16.979 183284 DEBUG nova.compute.resource_tracker [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Jan 21 18:18:17 compute-0 podman[205923]: 2026-01-21 18:18:17.002817146 +0000 UTC m=+0.053175528 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 21 18:18:17 compute-0 nova_compute[183278]: 2026-01-21 18:18:17.008 183284 DEBUG nova.compute.resource_tracker [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Migration adf03938-f324-4805-bba7-808a0b06006d is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Jan 21 18:18:17 compute-0 nova_compute[183278]: 2026-01-21 18:18:17.009 183284 DEBUG nova.compute.resource_tracker [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 18:18:17 compute-0 nova_compute[183278]: 2026-01-21 18:18:17.009 183284 DEBUG nova.compute.resource_tracker [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 18:18:17 compute-0 podman[205922]: 2026-01-21 18:18:17.027574155 +0000 UTC m=+0.081736628 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 21 18:18:17 compute-0 nova_compute[183278]: 2026-01-21 18:18:17.044 183284 DEBUG nova.compute.provider_tree [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Inventory has not changed in ProviderTree for provider: 502e4243-611b-433d-a766-9b485d51652d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 18:18:17 compute-0 nova_compute[183278]: 2026-01-21 18:18:17.315 183284 DEBUG nova.scheduler.client.report [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Inventory has not changed for provider 502e4243-611b-433d-a766-9b485d51652d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 18:18:17 compute-0 nova_compute[183278]: 2026-01-21 18:18:17.803 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:18:17 compute-0 nova_compute[183278]: 2026-01-21 18:18:17.806 183284 DEBUG nova.compute.resource_tracker [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 18:18:17 compute-0 nova_compute[183278]: 2026-01-21 18:18:17.807 183284 DEBUG oslo_concurrency.lockutils [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.875s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:18:17 compute-0 nova_compute[183278]: 2026-01-21 18:18:17.811 183284 INFO nova.compute.manager [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Jan 21 18:18:18 compute-0 nova_compute[183278]: 2026-01-21 18:18:18.890 183284 INFO nova.scheduler.client.report [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Deleted allocation for migration adf03938-f324-4805-bba7-808a0b06006d
Jan 21 18:18:18 compute-0 nova_compute[183278]: 2026-01-21 18:18:18.891 183284 DEBUG nova.virt.libvirt.driver [None req-0989439e-a211-47bb-b4e2-d474fcafc4c2 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Jan 21 18:18:19 compute-0 nova_compute[183278]: 2026-01-21 18:18:19.507 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:18:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:18:20.075 104698 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:18:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:18:20.075 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:18:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:18:20.075 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:18:21 compute-0 nova_compute[183278]: 2026-01-21 18:18:21.210 183284 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769019486.209003, b12214ad-ceca-4678-87c7-b9f991d6959e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:18:21 compute-0 nova_compute[183278]: 2026-01-21 18:18:21.210 183284 INFO nova.compute.manager [-] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] VM Stopped (Lifecycle Event)
Jan 21 18:18:21 compute-0 nova_compute[183278]: 2026-01-21 18:18:21.334 183284 DEBUG nova.compute.manager [None req-dd8857cf-66bd-445b-ba5e-571eff221906 - - - - - -] [instance: b12214ad-ceca-4678-87c7-b9f991d6959e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:18:21 compute-0 podman[205963]: 2026-01-21 18:18:21.995324098 +0000 UTC m=+0.055055714 container health_status 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 21 18:18:22 compute-0 nova_compute[183278]: 2026-01-21 18:18:22.805 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:18:24 compute-0 nova_compute[183278]: 2026-01-21 18:18:24.510 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:18:27 compute-0 nova_compute[183278]: 2026-01-21 18:18:27.807 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:18:29 compute-0 nova_compute[183278]: 2026-01-21 18:18:29.543 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:18:29 compute-0 podman[192560]: time="2026-01-21T18:18:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 18:18:29 compute-0 podman[192560]: @ - - [21/Jan/2026:18:18:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15354 "" "Go-http-client/1.1"
Jan 21 18:18:29 compute-0 podman[192560]: @ - - [21/Jan/2026:18:18:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2173 "" "Go-http-client/1.1"
Jan 21 18:18:31 compute-0 openstack_network_exporter[195402]: ERROR   18:18:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 21 18:18:31 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:18:31 compute-0 openstack_network_exporter[195402]: ERROR   18:18:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 21 18:18:31 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:18:32 compute-0 nova_compute[183278]: 2026-01-21 18:18:32.808 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:18:34 compute-0 nova_compute[183278]: 2026-01-21 18:18:34.545 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:18:37 compute-0 nova_compute[183278]: 2026-01-21 18:18:37.810 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:18:39 compute-0 nova_compute[183278]: 2026-01-21 18:18:39.547 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:18:40 compute-0 podman[205988]: 2026-01-21 18:18:40.989370988 +0000 UTC m=+0.048999686 container health_status 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., version=9.6, architecture=x86_64, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, distribution-scope=public, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter)
Jan 21 18:18:42 compute-0 nova_compute[183278]: 2026-01-21 18:18:42.812 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:18:44 compute-0 nova_compute[183278]: 2026-01-21 18:18:44.548 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:18:47 compute-0 nova_compute[183278]: 2026-01-21 18:18:47.814 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:18:47 compute-0 podman[206010]: 2026-01-21 18:18:47.994225073 +0000 UTC m=+0.051160529 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 18:18:48 compute-0 podman[206009]: 2026-01-21 18:18:48.024387858 +0000 UTC m=+0.084611994 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 21 18:18:49 compute-0 nova_compute[183278]: 2026-01-21 18:18:49.550 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:18:52 compute-0 nova_compute[183278]: 2026-01-21 18:18:52.859 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:18:52 compute-0 podman[206053]: 2026-01-21 18:18:52.9948286 +0000 UTC m=+0.056498319 container health_status 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 21 18:18:54 compute-0 ovn_controller[95419]: 2026-01-21T18:18:54Z|00069|memory_trim|INFO|Detected inactivity (last active 30007 ms ago): trimming memory
Jan 21 18:18:54 compute-0 nova_compute[183278]: 2026-01-21 18:18:54.550 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:18:57 compute-0 nova_compute[183278]: 2026-01-21 18:18:57.861 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:18:59 compute-0 nova_compute[183278]: 2026-01-21 18:18:59.552 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:18:59 compute-0 podman[192560]: time="2026-01-21T18:18:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 18:18:59 compute-0 podman[192560]: @ - - [21/Jan/2026:18:18:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15354 "" "Go-http-client/1.1"
Jan 21 18:18:59 compute-0 podman[192560]: @ - - [21/Jan/2026:18:18:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2177 "" "Go-http-client/1.1"
Jan 21 18:19:01 compute-0 openstack_network_exporter[195402]: ERROR   18:19:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 21 18:19:01 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:19:01 compute-0 openstack_network_exporter[195402]: ERROR   18:19:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 21 18:19:01 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:19:02 compute-0 nova_compute[183278]: 2026-01-21 18:19:02.863 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:19:04 compute-0 nova_compute[183278]: 2026-01-21 18:19:04.553 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:19:07 compute-0 nova_compute[183278]: 2026-01-21 18:19:07.865 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:19:08 compute-0 nova_compute[183278]: 2026-01-21 18:19:08.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:19:08 compute-0 nova_compute[183278]: 2026-01-21 18:19:08.817 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 18:19:08 compute-0 nova_compute[183278]: 2026-01-21 18:19:08.817 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 18:19:08 compute-0 nova_compute[183278]: 2026-01-21 18:19:08.920 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 21 18:19:09 compute-0 nova_compute[183278]: 2026-01-21 18:19:09.553 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:19:09 compute-0 nova_compute[183278]: 2026-01-21 18:19:09.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:19:09 compute-0 nova_compute[183278]: 2026-01-21 18:19:09.845 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:19:09 compute-0 nova_compute[183278]: 2026-01-21 18:19:09.845 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:19:09 compute-0 nova_compute[183278]: 2026-01-21 18:19:09.846 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:19:09 compute-0 nova_compute[183278]: 2026-01-21 18:19:09.846 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 18:19:09 compute-0 nova_compute[183278]: 2026-01-21 18:19:09.978 183284 WARNING nova.virt.libvirt.driver [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 18:19:09 compute-0 nova_compute[183278]: 2026-01-21 18:19:09.979 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5891MB free_disk=73.38157653808594GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 18:19:09 compute-0 nova_compute[183278]: 2026-01-21 18:19:09.979 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:19:09 compute-0 nova_compute[183278]: 2026-01-21 18:19:09.979 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:19:10 compute-0 nova_compute[183278]: 2026-01-21 18:19:10.044 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 18:19:10 compute-0 nova_compute[183278]: 2026-01-21 18:19:10.044 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 18:19:10 compute-0 nova_compute[183278]: 2026-01-21 18:19:10.063 183284 DEBUG nova.compute.provider_tree [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Inventory has not changed in ProviderTree for provider: 502e4243-611b-433d-a766-9b485d51652d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 18:19:10 compute-0 nova_compute[183278]: 2026-01-21 18:19:10.076 183284 DEBUG nova.scheduler.client.report [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Inventory has not changed for provider 502e4243-611b-433d-a766-9b485d51652d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 18:19:10 compute-0 nova_compute[183278]: 2026-01-21 18:19:10.077 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 18:19:10 compute-0 nova_compute[183278]: 2026-01-21 18:19:10.078 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.098s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:19:11 compute-0 nova_compute[183278]: 2026-01-21 18:19:11.078 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:19:11 compute-0 nova_compute[183278]: 2026-01-21 18:19:11.079 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:19:11 compute-0 nova_compute[183278]: 2026-01-21 18:19:11.812 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:19:11 compute-0 nova_compute[183278]: 2026-01-21 18:19:11.815 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:19:11 compute-0 nova_compute[183278]: 2026-01-21 18:19:11.816 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 18:19:11 compute-0 podman[206079]: 2026-01-21 18:19:11.998557677 +0000 UTC m=+0.055904624 container health_status 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, container_name=openstack_network_exporter, io.openshift.expose-services=, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.buildah.version=1.33.7, config_id=openstack_network_exporter)
Jan 21 18:19:12 compute-0 nova_compute[183278]: 2026-01-21 18:19:12.867 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:19:13 compute-0 nova_compute[183278]: 2026-01-21 18:19:13.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:19:14 compute-0 nova_compute[183278]: 2026-01-21 18:19:14.554 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:19:14 compute-0 nova_compute[183278]: 2026-01-21 18:19:14.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:19:15 compute-0 nova_compute[183278]: 2026-01-21 18:19:15.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:19:16 compute-0 nova_compute[183278]: 2026-01-21 18:19:16.811 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:19:17 compute-0 nova_compute[183278]: 2026-01-21 18:19:17.869 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:19:18 compute-0 podman[206099]: 2026-01-21 18:19:18.992274171 +0000 UTC m=+0.047679394 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 21 18:19:19 compute-0 podman[206098]: 2026-01-21 18:19:19.025594674 +0000 UTC m=+0.084977194 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack 
Kubernetes Operator team, managed_by=edpm_ansible)
Jan 21 18:19:19 compute-0 nova_compute[183278]: 2026-01-21 18:19:19.557 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:19:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:19:20.075 104698 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:19:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:19:20.076 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:19:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:19:20.076 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:19:22 compute-0 nova_compute[183278]: 2026-01-21 18:19:22.871 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:19:23 compute-0 podman[206144]: 2026-01-21 18:19:23.987410435 +0000 UTC m=+0.048626097 container health_status 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 21 18:19:24 compute-0 nova_compute[183278]: 2026-01-21 18:19:24.558 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:19:27 compute-0 nova_compute[183278]: 2026-01-21 18:19:27.873 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:19:28 compute-0 nova_compute[183278]: 2026-01-21 18:19:28.448 183284 DEBUG nova.compute.manager [None req-cb369d23-bc98-4ebc-bc86-d39ff3bd0737 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] Removing trait COMPUTE_STATUS_DISABLED from compute node resource provider 502e4243-611b-433d-a766-9b485d51652d in placement. update_compute_provider_status /usr/lib/python3.9/site-packages/nova/compute/manager.py:606
Jan 21 18:19:28 compute-0 nova_compute[183278]: 2026-01-21 18:19:28.524 183284 DEBUG nova.compute.provider_tree [None req-cb369d23-bc98-4ebc-bc86-d39ff3bd0737 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] Updating resource provider 502e4243-611b-433d-a766-9b485d51652d generation from 14 to 16 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Jan 21 18:19:29 compute-0 nova_compute[183278]: 2026-01-21 18:19:29.609 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:19:29 compute-0 podman[192560]: time="2026-01-21T18:19:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 18:19:29 compute-0 podman[192560]: @ - - [21/Jan/2026:18:19:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15354 "" "Go-http-client/1.1"
Jan 21 18:19:29 compute-0 podman[192560]: @ - - [21/Jan/2026:18:19:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2174 "" "Go-http-client/1.1"
Jan 21 18:19:30 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:19:30.551 104698 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e2:aa:66', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f6:39:87:73:c8'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 18:19:30 compute-0 nova_compute[183278]: 2026-01-21 18:19:30.552 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:19:30 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:19:30.552 104698 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 21 18:19:31 compute-0 openstack_network_exporter[195402]: ERROR   18:19:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 21 18:19:31 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:19:31 compute-0 openstack_network_exporter[195402]: ERROR   18:19:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 21 18:19:31 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:19:32 compute-0 nova_compute[183278]: 2026-01-21 18:19:32.918 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:19:34 compute-0 nova_compute[183278]: 2026-01-21 18:19:34.610 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:19:37 compute-0 nova_compute[183278]: 2026-01-21 18:19:37.920 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:19:38 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:19:38.554 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a4db021d-a451-4e5f-8011-49af760bda68, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:19:39 compute-0 nova_compute[183278]: 2026-01-21 18:19:39.612 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:19:42 compute-0 nova_compute[183278]: 2026-01-21 18:19:42.922 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:19:43 compute-0 podman[206170]: 2026-01-21 18:19:43.024546928 +0000 UTC m=+0.078845624 container health_status 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, version=9.6, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., container_name=openstack_network_exporter, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, release=1755695350, architecture=x86_64, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 21 18:19:44 compute-0 nova_compute[183278]: 2026-01-21 18:19:44.654 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:19:47 compute-0 nova_compute[183278]: 2026-01-21 18:19:47.925 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:19:49 compute-0 nova_compute[183278]: 2026-01-21 18:19:49.692 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:19:49 compute-0 podman[206192]: 2026-01-21 18:19:49.99037402 +0000 UTC m=+0.047007657 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 21 18:19:50 compute-0 podman[206191]: 2026-01-21 18:19:50.014461437 +0000 UTC m=+0.074706273 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, 
org.label-schema.vendor=CentOS, config_id=ovn_controller)
Jan 21 18:19:52 compute-0 nova_compute[183278]: 2026-01-21 18:19:52.927 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:19:54 compute-0 nova_compute[183278]: 2026-01-21 18:19:54.694 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:19:54 compute-0 podman[206237]: 2026-01-21 18:19:54.991807428 +0000 UTC m=+0.051191290 container health_status 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 18:19:57 compute-0 nova_compute[183278]: 2026-01-21 18:19:57.930 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:19:59 compute-0 podman[192560]: time="2026-01-21T18:19:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 18:19:59 compute-0 nova_compute[183278]: 2026-01-21 18:19:59.918 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:19:59 compute-0 podman[192560]: @ - - [21/Jan/2026:18:19:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15354 "" "Go-http-client/1.1"
Jan 21 18:19:59 compute-0 podman[192560]: @ - - [21/Jan/2026:18:19:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2172 "" "Go-http-client/1.1"
Jan 21 18:20:01 compute-0 openstack_network_exporter[195402]: ERROR   18:20:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 21 18:20:01 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:20:01 compute-0 openstack_network_exporter[195402]: ERROR   18:20:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 21 18:20:01 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:20:02 compute-0 nova_compute[183278]: 2026-01-21 18:20:02.931 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:20:04 compute-0 nova_compute[183278]: 2026-01-21 18:20:04.921 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:20:06 compute-0 nova_compute[183278]: 2026-01-21 18:20:06.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:20:06 compute-0 nova_compute[183278]: 2026-01-21 18:20:06.817 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 21 18:20:07 compute-0 nova_compute[183278]: 2026-01-21 18:20:07.946 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:20:09 compute-0 nova_compute[183278]: 2026-01-21 18:20:09.922 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:20:10 compute-0 nova_compute[183278]: 2026-01-21 18:20:10.833 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:20:10 compute-0 nova_compute[183278]: 2026-01-21 18:20:10.833 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 18:20:10 compute-0 nova_compute[183278]: 2026-01-21 18:20:10.834 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 18:20:10 compute-0 nova_compute[183278]: 2026-01-21 18:20:10.848 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 21 18:20:10 compute-0 nova_compute[183278]: 2026-01-21 18:20:10.848 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:20:10 compute-0 nova_compute[183278]: 2026-01-21 18:20:10.876 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:20:10 compute-0 nova_compute[183278]: 2026-01-21 18:20:10.876 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:20:10 compute-0 nova_compute[183278]: 2026-01-21 18:20:10.876 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:20:10 compute-0 nova_compute[183278]: 2026-01-21 18:20:10.877 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 18:20:10 compute-0 ovn_controller[95419]: 2026-01-21T18:20:10Z|00070|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Jan 21 18:20:11 compute-0 nova_compute[183278]: 2026-01-21 18:20:11.018 183284 WARNING nova.virt.libvirt.driver [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 18:20:11 compute-0 nova_compute[183278]: 2026-01-21 18:20:11.019 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5887MB free_disk=73.38159561157227GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 18:20:11 compute-0 nova_compute[183278]: 2026-01-21 18:20:11.019 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:20:11 compute-0 nova_compute[183278]: 2026-01-21 18:20:11.019 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:20:11 compute-0 nova_compute[183278]: 2026-01-21 18:20:11.198 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 18:20:11 compute-0 nova_compute[183278]: 2026-01-21 18:20:11.198 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 18:20:11 compute-0 nova_compute[183278]: 2026-01-21 18:20:11.255 183284 DEBUG nova.compute.provider_tree [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Inventory has not changed in ProviderTree for provider: 502e4243-611b-433d-a766-9b485d51652d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 18:20:11 compute-0 nova_compute[183278]: 2026-01-21 18:20:11.270 183284 DEBUG nova.scheduler.client.report [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Inventory has not changed for provider 502e4243-611b-433d-a766-9b485d51652d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 18:20:11 compute-0 nova_compute[183278]: 2026-01-21 18:20:11.272 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 18:20:11 compute-0 nova_compute[183278]: 2026-01-21 18:20:11.272 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.252s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:20:12 compute-0 sshd-session[206262]: Invalid user eigenlayer from 64.227.98.100 port 47872
Jan 21 18:20:12 compute-0 sshd-session[206262]: Connection closed by invalid user eigenlayer 64.227.98.100 port 47872 [preauth]
Jan 21 18:20:12 compute-0 nova_compute[183278]: 2026-01-21 18:20:12.992 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:20:13 compute-0 nova_compute[183278]: 2026-01-21 18:20:13.240 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:20:13 compute-0 nova_compute[183278]: 2026-01-21 18:20:13.240 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:20:13 compute-0 nova_compute[183278]: 2026-01-21 18:20:13.240 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:20:13 compute-0 nova_compute[183278]: 2026-01-21 18:20:13.241 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:20:13 compute-0 nova_compute[183278]: 2026-01-21 18:20:13.241 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 18:20:14 compute-0 podman[206264]: 2026-01-21 18:20:14.001345356 +0000 UTC m=+0.053750432 container health_status 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, vcs-type=git, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter)
Jan 21 18:20:14 compute-0 nova_compute[183278]: 2026-01-21 18:20:14.923 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:20:15 compute-0 nova_compute[183278]: 2026-01-21 18:20:15.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:20:15 compute-0 nova_compute[183278]: 2026-01-21 18:20:15.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:20:15 compute-0 nova_compute[183278]: 2026-01-21 18:20:15.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:20:17 compute-0 nova_compute[183278]: 2026-01-21 18:20:17.994 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:20:19 compute-0 nova_compute[183278]: 2026-01-21 18:20:19.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:20:19 compute-0 nova_compute[183278]: 2026-01-21 18:20:19.817 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 21 18:20:19 compute-0 nova_compute[183278]: 2026-01-21 18:20:19.842 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 21 18:20:19 compute-0 nova_compute[183278]: 2026-01-21 18:20:19.925 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:20:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:20:20.077 104698 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:20:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:20:20.077 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:20:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:20:20.077 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:20:20 compute-0 nova_compute[183278]: 2026-01-21 18:20:20.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:20:20 compute-0 podman[206286]: 2026-01-21 18:20:20.995327365 +0000 UTC m=+0.045913820 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 21 18:20:21 compute-0 podman[206285]: 2026-01-21 18:20:21.01764229 +0000 UTC m=+0.072625303 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller)
Jan 21 18:20:22 compute-0 nova_compute[183278]: 2026-01-21 18:20:22.997 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:20:24 compute-0 nova_compute[183278]: 2026-01-21 18:20:24.927 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:20:26 compute-0 podman[206327]: 2026-01-21 18:20:26.011359739 +0000 UTC m=+0.071308011 container health_status 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 21 18:20:27 compute-0 nova_compute[183278]: 2026-01-21 18:20:27.999 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:20:29 compute-0 podman[192560]: time="2026-01-21T18:20:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 18:20:29 compute-0 podman[192560]: @ - - [21/Jan/2026:18:20:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15354 "" "Go-http-client/1.1"
Jan 21 18:20:29 compute-0 podman[192560]: @ - - [21/Jan/2026:18:20:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2176 "" "Go-http-client/1.1"
Jan 21 18:20:29 compute-0 nova_compute[183278]: 2026-01-21 18:20:29.929 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:20:31 compute-0 openstack_network_exporter[195402]: ERROR   18:20:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 21 18:20:31 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:20:31 compute-0 openstack_network_exporter[195402]: ERROR   18:20:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 21 18:20:31 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:20:31 compute-0 nova_compute[183278]: 2026-01-21 18:20:31.768 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:20:31 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:20:31.769 104698 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e2:aa:66', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f6:39:87:73:c8'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 18:20:31 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:20:31.769 104698 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 21 18:20:31 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:20:31.771 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a4db021d-a451-4e5f-8011-49af760bda68, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:20:33 compute-0 nova_compute[183278]: 2026-01-21 18:20:33.000 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:20:34 compute-0 nova_compute[183278]: 2026-01-21 18:20:34.932 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:20:38 compute-0 nova_compute[183278]: 2026-01-21 18:20:38.003 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:20:39 compute-0 nova_compute[183278]: 2026-01-21 18:20:39.933 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:20:43 compute-0 nova_compute[183278]: 2026-01-21 18:20:43.005 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:20:44 compute-0 nova_compute[183278]: 2026-01-21 18:20:44.934 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:20:45 compute-0 podman[206349]: 2026-01-21 18:20:45.014290308 +0000 UTC m=+0.062254688 container health_status 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, version=9.6, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., managed_by=edpm_ansible, architecture=x86_64, config_id=openstack_network_exporter, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Jan 21 18:20:48 compute-0 nova_compute[183278]: 2026-01-21 18:20:48.007 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:20:49 compute-0 nova_compute[183278]: 2026-01-21 18:20:49.935 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:20:51 compute-0 podman[206372]: 2026-01-21 18:20:51.999423798 +0000 UTC m=+0.051412986 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251202, io.buildah.version=1.41.3)
Jan 21 18:20:52 compute-0 podman[206371]: 2026-01-21 18:20:52.020318715 +0000 UTC m=+0.079194590 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0)
Jan 21 18:20:53 compute-0 nova_compute[183278]: 2026-01-21 18:20:53.009 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:20:54 compute-0 nova_compute[183278]: 2026-01-21 18:20:54.936 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:20:56 compute-0 podman[206416]: 2026-01-21 18:20:56.987583257 +0000 UTC m=+0.048348612 container health_status 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 21 18:20:58 compute-0 nova_compute[183278]: 2026-01-21 18:20:58.011 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:20:59 compute-0 podman[192560]: time="2026-01-21T18:20:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 18:20:59 compute-0 podman[192560]: @ - - [21/Jan/2026:18:20:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15354 "" "Go-http-client/1.1"
Jan 21 18:20:59 compute-0 podman[192560]: @ - - [21/Jan/2026:18:20:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2176 "" "Go-http-client/1.1"
Jan 21 18:20:59 compute-0 nova_compute[183278]: 2026-01-21 18:20:59.939 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:21:01 compute-0 openstack_network_exporter[195402]: ERROR   18:21:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 21 18:21:01 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:21:01 compute-0 openstack_network_exporter[195402]: ERROR   18:21:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 21 18:21:01 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:21:03 compute-0 nova_compute[183278]: 2026-01-21 18:21:03.012 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:21:04 compute-0 nova_compute[183278]: 2026-01-21 18:21:04.939 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:21:08 compute-0 nova_compute[183278]: 2026-01-21 18:21:08.054 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:21:09 compute-0 nova_compute[183278]: 2026-01-21 18:21:09.942 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:21:11 compute-0 nova_compute[183278]: 2026-01-21 18:21:11.873 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:21:11 compute-0 nova_compute[183278]: 2026-01-21 18:21:11.874 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 18:21:11 compute-0 nova_compute[183278]: 2026-01-21 18:21:11.875 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 18:21:11 compute-0 nova_compute[183278]: 2026-01-21 18:21:11.891 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 21 18:21:11 compute-0 nova_compute[183278]: 2026-01-21 18:21:11.892 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:21:11 compute-0 nova_compute[183278]: 2026-01-21 18:21:11.912 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:21:11 compute-0 nova_compute[183278]: 2026-01-21 18:21:11.912 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:21:11 compute-0 nova_compute[183278]: 2026-01-21 18:21:11.913 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:21:11 compute-0 nova_compute[183278]: 2026-01-21 18:21:11.913 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 18:21:12 compute-0 nova_compute[183278]: 2026-01-21 18:21:12.041 183284 WARNING nova.virt.libvirt.driver [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 18:21:12 compute-0 nova_compute[183278]: 2026-01-21 18:21:12.042 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5887MB free_disk=73.38159561157227GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 18:21:12 compute-0 nova_compute[183278]: 2026-01-21 18:21:12.043 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:21:12 compute-0 nova_compute[183278]: 2026-01-21 18:21:12.043 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:21:12 compute-0 nova_compute[183278]: 2026-01-21 18:21:12.102 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 18:21:12 compute-0 nova_compute[183278]: 2026-01-21 18:21:12.102 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 18:21:12 compute-0 nova_compute[183278]: 2026-01-21 18:21:12.178 183284 DEBUG nova.compute.provider_tree [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Inventory has not changed in ProviderTree for provider: 502e4243-611b-433d-a766-9b485d51652d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 18:21:12 compute-0 nova_compute[183278]: 2026-01-21 18:21:12.198 183284 DEBUG nova.scheduler.client.report [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Inventory has not changed for provider 502e4243-611b-433d-a766-9b485d51652d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 18:21:12 compute-0 nova_compute[183278]: 2026-01-21 18:21:12.199 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 18:21:12 compute-0 nova_compute[183278]: 2026-01-21 18:21:12.199 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.156s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:21:13 compute-0 nova_compute[183278]: 2026-01-21 18:21:13.057 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:21:13 compute-0 nova_compute[183278]: 2026-01-21 18:21:13.923 183284 DEBUG oslo_concurrency.lockutils [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Acquiring lock "c5a6214c-5527-4115-ad3b-6320d177029b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:21:13 compute-0 nova_compute[183278]: 2026-01-21 18:21:13.923 183284 DEBUG oslo_concurrency.lockutils [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Lock "c5a6214c-5527-4115-ad3b-6320d177029b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:21:14 compute-0 nova_compute[183278]: 2026-01-21 18:21:14.072 183284 DEBUG nova.compute.manager [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] [instance: c5a6214c-5527-4115-ad3b-6320d177029b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 21 18:21:14 compute-0 nova_compute[183278]: 2026-01-21 18:21:14.123 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:21:14 compute-0 nova_compute[183278]: 2026-01-21 18:21:14.398 183284 DEBUG oslo_concurrency.lockutils [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:21:14 compute-0 nova_compute[183278]: 2026-01-21 18:21:14.398 183284 DEBUG oslo_concurrency.lockutils [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:21:14 compute-0 nova_compute[183278]: 2026-01-21 18:21:14.404 183284 DEBUG nova.virt.hardware [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 18:21:14 compute-0 nova_compute[183278]: 2026-01-21 18:21:14.404 183284 INFO nova.compute.claims [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] [instance: c5a6214c-5527-4115-ad3b-6320d177029b] Claim successful on node compute-0.ctlplane.example.com
Jan 21 18:21:14 compute-0 nova_compute[183278]: 2026-01-21 18:21:14.522 183284 DEBUG nova.compute.provider_tree [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Inventory has not changed in ProviderTree for provider: 502e4243-611b-433d-a766-9b485d51652d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 18:21:14 compute-0 nova_compute[183278]: 2026-01-21 18:21:14.536 183284 DEBUG nova.scheduler.client.report [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Inventory has not changed for provider 502e4243-611b-433d-a766-9b485d51652d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 18:21:14 compute-0 nova_compute[183278]: 2026-01-21 18:21:14.572 183284 DEBUG oslo_concurrency.lockutils [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.174s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:21:14 compute-0 nova_compute[183278]: 2026-01-21 18:21:14.573 183284 DEBUG nova.compute.manager [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] [instance: c5a6214c-5527-4115-ad3b-6320d177029b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 21 18:21:14 compute-0 nova_compute[183278]: 2026-01-21 18:21:14.637 183284 DEBUG nova.compute.manager [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] [instance: c5a6214c-5527-4115-ad3b-6320d177029b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 21 18:21:14 compute-0 nova_compute[183278]: 2026-01-21 18:21:14.637 183284 DEBUG nova.network.neutron [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] [instance: c5a6214c-5527-4115-ad3b-6320d177029b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 21 18:21:14 compute-0 nova_compute[183278]: 2026-01-21 18:21:14.660 183284 INFO nova.virt.libvirt.driver [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] [instance: c5a6214c-5527-4115-ad3b-6320d177029b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 21 18:21:14 compute-0 nova_compute[183278]: 2026-01-21 18:21:14.674 183284 DEBUG nova.compute.manager [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] [instance: c5a6214c-5527-4115-ad3b-6320d177029b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 21 18:21:14 compute-0 nova_compute[183278]: 2026-01-21 18:21:14.794 183284 DEBUG nova.compute.manager [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] [instance: c5a6214c-5527-4115-ad3b-6320d177029b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 21 18:21:14 compute-0 nova_compute[183278]: 2026-01-21 18:21:14.795 183284 DEBUG nova.virt.libvirt.driver [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] [instance: c5a6214c-5527-4115-ad3b-6320d177029b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 18:21:14 compute-0 nova_compute[183278]: 2026-01-21 18:21:14.796 183284 INFO nova.virt.libvirt.driver [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] [instance: c5a6214c-5527-4115-ad3b-6320d177029b] Creating image(s)
Jan 21 18:21:14 compute-0 nova_compute[183278]: 2026-01-21 18:21:14.796 183284 DEBUG oslo_concurrency.lockutils [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Acquiring lock "/var/lib/nova/instances/c5a6214c-5527-4115-ad3b-6320d177029b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:21:14 compute-0 nova_compute[183278]: 2026-01-21 18:21:14.797 183284 DEBUG oslo_concurrency.lockutils [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Lock "/var/lib/nova/instances/c5a6214c-5527-4115-ad3b-6320d177029b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:21:14 compute-0 nova_compute[183278]: 2026-01-21 18:21:14.797 183284 DEBUG oslo_concurrency.lockutils [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Lock "/var/lib/nova/instances/c5a6214c-5527-4115-ad3b-6320d177029b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:21:14 compute-0 nova_compute[183278]: 2026-01-21 18:21:14.810 183284 DEBUG nova.policy [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3ab455a0326b442d986277b4d934e2b2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '05d148a48e724bbaa4c36f8069f80fbd', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 21 18:21:14 compute-0 nova_compute[183278]: 2026-01-21 18:21:14.812 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:21:14 compute-0 nova_compute[183278]: 2026-01-21 18:21:14.813 183284 DEBUG oslo_concurrency.processutils [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:21:14 compute-0 nova_compute[183278]: 2026-01-21 18:21:14.827 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:21:14 compute-0 nova_compute[183278]: 2026-01-21 18:21:14.827 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:21:14 compute-0 nova_compute[183278]: 2026-01-21 18:21:14.828 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 18:21:14 compute-0 nova_compute[183278]: 2026-01-21 18:21:14.863 183284 DEBUG oslo_concurrency.processutils [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:21:14 compute-0 nova_compute[183278]: 2026-01-21 18:21:14.864 183284 DEBUG oslo_concurrency.lockutils [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Acquiring lock "8bf45782a806e8f2f684ae874be9ab99d891a685" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:21:14 compute-0 nova_compute[183278]: 2026-01-21 18:21:14.865 183284 DEBUG oslo_concurrency.lockutils [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Lock "8bf45782a806e8f2f684ae874be9ab99d891a685" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:21:14 compute-0 nova_compute[183278]: 2026-01-21 18:21:14.875 183284 DEBUG oslo_concurrency.processutils [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:21:14 compute-0 nova_compute[183278]: 2026-01-21 18:21:14.925 183284 DEBUG oslo_concurrency.processutils [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:21:14 compute-0 nova_compute[183278]: 2026-01-21 18:21:14.926 183284 DEBUG oslo_concurrency.processutils [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685,backing_fmt=raw /var/lib/nova/instances/c5a6214c-5527-4115-ad3b-6320d177029b/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:21:14 compute-0 nova_compute[183278]: 2026-01-21 18:21:14.943 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:21:14 compute-0 nova_compute[183278]: 2026-01-21 18:21:14.957 183284 DEBUG oslo_concurrency.processutils [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685,backing_fmt=raw /var/lib/nova/instances/c5a6214c-5527-4115-ad3b-6320d177029b/disk 1073741824" returned: 0 in 0.031s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:21:14 compute-0 nova_compute[183278]: 2026-01-21 18:21:14.957 183284 DEBUG oslo_concurrency.lockutils [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Lock "8bf45782a806e8f2f684ae874be9ab99d891a685" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.093s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:21:14 compute-0 nova_compute[183278]: 2026-01-21 18:21:14.958 183284 DEBUG oslo_concurrency.processutils [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:21:15 compute-0 nova_compute[183278]: 2026-01-21 18:21:15.009 183284 DEBUG oslo_concurrency.processutils [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:21:15 compute-0 nova_compute[183278]: 2026-01-21 18:21:15.010 183284 DEBUG nova.virt.disk.api [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Checking if we can resize image /var/lib/nova/instances/c5a6214c-5527-4115-ad3b-6320d177029b/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 18:21:15 compute-0 nova_compute[183278]: 2026-01-21 18:21:15.010 183284 DEBUG oslo_concurrency.processutils [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c5a6214c-5527-4115-ad3b-6320d177029b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:21:15 compute-0 nova_compute[183278]: 2026-01-21 18:21:15.062 183284 DEBUG oslo_concurrency.processutils [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c5a6214c-5527-4115-ad3b-6320d177029b/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:21:15 compute-0 nova_compute[183278]: 2026-01-21 18:21:15.063 183284 DEBUG nova.virt.disk.api [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Cannot resize image /var/lib/nova/instances/c5a6214c-5527-4115-ad3b-6320d177029b/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 18:21:15 compute-0 nova_compute[183278]: 2026-01-21 18:21:15.064 183284 DEBUG nova.objects.instance [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Lazy-loading 'migration_context' on Instance uuid c5a6214c-5527-4115-ad3b-6320d177029b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:21:15 compute-0 nova_compute[183278]: 2026-01-21 18:21:15.085 183284 DEBUG nova.virt.libvirt.driver [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] [instance: c5a6214c-5527-4115-ad3b-6320d177029b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 18:21:15 compute-0 nova_compute[183278]: 2026-01-21 18:21:15.085 183284 DEBUG nova.virt.libvirt.driver [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] [instance: c5a6214c-5527-4115-ad3b-6320d177029b] Ensure instance console log exists: /var/lib/nova/instances/c5a6214c-5527-4115-ad3b-6320d177029b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 18:21:15 compute-0 nova_compute[183278]: 2026-01-21 18:21:15.086 183284 DEBUG oslo_concurrency.lockutils [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:21:15 compute-0 nova_compute[183278]: 2026-01-21 18:21:15.086 183284 DEBUG oslo_concurrency.lockutils [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:21:15 compute-0 nova_compute[183278]: 2026-01-21 18:21:15.086 183284 DEBUG oslo_concurrency.lockutils [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:21:15 compute-0 nova_compute[183278]: 2026-01-21 18:21:15.583 183284 DEBUG nova.network.neutron [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] [instance: c5a6214c-5527-4115-ad3b-6320d177029b] Successfully created port: 1159582e-ade2-4ad8-9e4e-f841287d3a51 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 21 18:21:15 compute-0 podman[206457]: 2026-01-21 18:21:15.997485592 +0000 UTC m=+0.054698006 container health_status 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, name=ubi9-minimal, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, architecture=x86_64, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, io.buildah.version=1.33.7, release=1755695350, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.openshift.expose-services=, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 21 18:21:16 compute-0 nova_compute[183278]: 2026-01-21 18:21:16.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:21:16 compute-0 nova_compute[183278]: 2026-01-21 18:21:16.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:21:17 compute-0 nova_compute[183278]: 2026-01-21 18:21:17.754 183284 DEBUG nova.network.neutron [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] [instance: c5a6214c-5527-4115-ad3b-6320d177029b] Successfully updated port: 1159582e-ade2-4ad8-9e4e-f841287d3a51 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 21 18:21:17 compute-0 nova_compute[183278]: 2026-01-21 18:21:17.771 183284 DEBUG oslo_concurrency.lockutils [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Acquiring lock "refresh_cache-c5a6214c-5527-4115-ad3b-6320d177029b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:21:17 compute-0 nova_compute[183278]: 2026-01-21 18:21:17.771 183284 DEBUG oslo_concurrency.lockutils [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Acquired lock "refresh_cache-c5a6214c-5527-4115-ad3b-6320d177029b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 18:21:17 compute-0 nova_compute[183278]: 2026-01-21 18:21:17.771 183284 DEBUG nova.network.neutron [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] [instance: c5a6214c-5527-4115-ad3b-6320d177029b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 18:21:17 compute-0 nova_compute[183278]: 2026-01-21 18:21:17.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:21:17 compute-0 nova_compute[183278]: 2026-01-21 18:21:17.847 183284 DEBUG nova.compute.manager [req-4a3109f1-6bc6-4bd7-8be4-06900bb74446 req-cfd63aa8-405e-47b7-ba41-9b9c63088f01 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c5a6214c-5527-4115-ad3b-6320d177029b] Received event network-changed-1159582e-ade2-4ad8-9e4e-f841287d3a51 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:21:17 compute-0 nova_compute[183278]: 2026-01-21 18:21:17.847 183284 DEBUG nova.compute.manager [req-4a3109f1-6bc6-4bd7-8be4-06900bb74446 req-cfd63aa8-405e-47b7-ba41-9b9c63088f01 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c5a6214c-5527-4115-ad3b-6320d177029b] Refreshing instance network info cache due to event network-changed-1159582e-ade2-4ad8-9e4e-f841287d3a51. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 18:21:17 compute-0 nova_compute[183278]: 2026-01-21 18:21:17.847 183284 DEBUG oslo_concurrency.lockutils [req-4a3109f1-6bc6-4bd7-8be4-06900bb74446 req-cfd63aa8-405e-47b7-ba41-9b9c63088f01 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "refresh_cache-c5a6214c-5527-4115-ad3b-6320d177029b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:21:18 compute-0 nova_compute[183278]: 2026-01-21 18:21:18.059 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:21:18 compute-0 nova_compute[183278]: 2026-01-21 18:21:18.071 183284 DEBUG nova.network.neutron [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] [instance: c5a6214c-5527-4115-ad3b-6320d177029b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 18:21:19 compute-0 nova_compute[183278]: 2026-01-21 18:21:19.812 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:21:19 compute-0 nova_compute[183278]: 2026-01-21 18:21:19.945 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:21:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:21:20.077 104698 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:21:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:21:20.078 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:21:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:21:20.078 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:21:21 compute-0 nova_compute[183278]: 2026-01-21 18:21:21.623 183284 DEBUG nova.network.neutron [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] [instance: c5a6214c-5527-4115-ad3b-6320d177029b] Updating instance_info_cache with network_info: [{"id": "1159582e-ade2-4ad8-9e4e-f841287d3a51", "address": "fa:16:3e:8a:6e:ce", "network": {"id": "5c0bfae5-9d33-4194-a6cd-c123314635af", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1349569805-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05d148a48e724bbaa4c36f8069f80fbd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1159582e-ad", "ovs_interfaceid": "1159582e-ade2-4ad8-9e4e-f841287d3a51", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:21:22 compute-0 nova_compute[183278]: 2026-01-21 18:21:22.585 183284 DEBUG oslo_concurrency.lockutils [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Releasing lock "refresh_cache-c5a6214c-5527-4115-ad3b-6320d177029b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 18:21:22 compute-0 nova_compute[183278]: 2026-01-21 18:21:22.585 183284 DEBUG nova.compute.manager [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] [instance: c5a6214c-5527-4115-ad3b-6320d177029b] Instance network_info: |[{"id": "1159582e-ade2-4ad8-9e4e-f841287d3a51", "address": "fa:16:3e:8a:6e:ce", "network": {"id": "5c0bfae5-9d33-4194-a6cd-c123314635af", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1349569805-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05d148a48e724bbaa4c36f8069f80fbd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1159582e-ad", "ovs_interfaceid": "1159582e-ade2-4ad8-9e4e-f841287d3a51", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 21 18:21:22 compute-0 nova_compute[183278]: 2026-01-21 18:21:22.586 183284 DEBUG oslo_concurrency.lockutils [req-4a3109f1-6bc6-4bd7-8be4-06900bb74446 req-cfd63aa8-405e-47b7-ba41-9b9c63088f01 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquired lock "refresh_cache-c5a6214c-5527-4115-ad3b-6320d177029b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 18:21:22 compute-0 nova_compute[183278]: 2026-01-21 18:21:22.586 183284 DEBUG nova.network.neutron [req-4a3109f1-6bc6-4bd7-8be4-06900bb74446 req-cfd63aa8-405e-47b7-ba41-9b9c63088f01 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c5a6214c-5527-4115-ad3b-6320d177029b] Refreshing network info cache for port 1159582e-ade2-4ad8-9e4e-f841287d3a51 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 18:21:22 compute-0 nova_compute[183278]: 2026-01-21 18:21:22.588 183284 DEBUG nova.virt.libvirt.driver [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] [instance: c5a6214c-5527-4115-ad3b-6320d177029b] Start _get_guest_xml network_info=[{"id": "1159582e-ade2-4ad8-9e4e-f841287d3a51", "address": "fa:16:3e:8a:6e:ce", "network": {"id": "5c0bfae5-9d33-4194-a6cd-c123314635af", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1349569805-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05d148a48e724bbaa4c36f8069f80fbd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1159582e-ad", "ovs_interfaceid": "1159582e-ade2-4ad8-9e4e-f841287d3a51", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T18:09:31Z,direct_url=<?>,disk_format='qcow2',id=672306ae-5521-4fc1-a825-a16d6d125c61,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='05bd8d3790e24414a90ecf55ee1939e1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T18:09:33Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_type': 'disk', 'encryption_options': None, 'boot_index': 0, 'encryption_secret_uuid': None, 'encryption_format': None, 'image_id': '672306ae-5521-4fc1-a825-a16d6d125c61'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 18:21:22 compute-0 nova_compute[183278]: 2026-01-21 18:21:22.592 183284 WARNING nova.virt.libvirt.driver [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 18:21:22 compute-0 nova_compute[183278]: 2026-01-21 18:21:22.598 183284 DEBUG nova.virt.libvirt.host [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 18:21:22 compute-0 nova_compute[183278]: 2026-01-21 18:21:22.599 183284 DEBUG nova.virt.libvirt.host [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 18:21:22 compute-0 nova_compute[183278]: 2026-01-21 18:21:22.601 183284 DEBUG nova.virt.libvirt.host [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 18:21:22 compute-0 nova_compute[183278]: 2026-01-21 18:21:22.602 183284 DEBUG nova.virt.libvirt.host [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 18:21:22 compute-0 nova_compute[183278]: 2026-01-21 18:21:22.603 183284 DEBUG nova.virt.libvirt.driver [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 18:21:22 compute-0 nova_compute[183278]: 2026-01-21 18:21:22.603 183284 DEBUG nova.virt.hardware [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T18:09:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='45095fe9-3fd5-4f1f-87b2-a2a8292135a2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T18:09:31Z,direct_url=<?>,disk_format='qcow2',id=672306ae-5521-4fc1-a825-a16d6d125c61,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='05bd8d3790e24414a90ecf55ee1939e1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T18:09:33Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 18:21:22 compute-0 nova_compute[183278]: 2026-01-21 18:21:22.604 183284 DEBUG nova.virt.hardware [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 18:21:22 compute-0 nova_compute[183278]: 2026-01-21 18:21:22.604 183284 DEBUG nova.virt.hardware [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 18:21:22 compute-0 nova_compute[183278]: 2026-01-21 18:21:22.604 183284 DEBUG nova.virt.hardware [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 18:21:22 compute-0 nova_compute[183278]: 2026-01-21 18:21:22.604 183284 DEBUG nova.virt.hardware [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 18:21:22 compute-0 nova_compute[183278]: 2026-01-21 18:21:22.604 183284 DEBUG nova.virt.hardware [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 18:21:22 compute-0 nova_compute[183278]: 2026-01-21 18:21:22.605 183284 DEBUG nova.virt.hardware [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 18:21:22 compute-0 nova_compute[183278]: 2026-01-21 18:21:22.605 183284 DEBUG nova.virt.hardware [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 18:21:22 compute-0 nova_compute[183278]: 2026-01-21 18:21:22.605 183284 DEBUG nova.virt.hardware [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 18:21:22 compute-0 nova_compute[183278]: 2026-01-21 18:21:22.605 183284 DEBUG nova.virt.hardware [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 18:21:22 compute-0 nova_compute[183278]: 2026-01-21 18:21:22.605 183284 DEBUG nova.virt.hardware [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 18:21:22 compute-0 nova_compute[183278]: 2026-01-21 18:21:22.608 183284 DEBUG nova.virt.libvirt.vif [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T18:21:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1525730657',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1525730657',id=10,image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='05d148a48e724bbaa4c36f8069f80fbd',ramdisk_id='',reservation_id='r-wgas9oiz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1250997236',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-1250997236-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T18:21:14Z,user_data=None,user_id='3ab455a0326b442d986277b4d934e2b2',uuid=c5a6214c-5527-4115-ad3b-6320d177029b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1159582e-ade2-4ad8-9e4e-f841287d3a51", "address": "fa:16:3e:8a:6e:ce", "network": {"id": "5c0bfae5-9d33-4194-a6cd-c123314635af", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1349569805-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05d148a48e724bbaa4c36f8069f80fbd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1159582e-ad", "ovs_interfaceid": "1159582e-ade2-4ad8-9e4e-f841287d3a51", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 18:21:22 compute-0 nova_compute[183278]: 2026-01-21 18:21:22.609 183284 DEBUG nova.network.os_vif_util [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Converting VIF {"id": "1159582e-ade2-4ad8-9e4e-f841287d3a51", "address": "fa:16:3e:8a:6e:ce", "network": {"id": "5c0bfae5-9d33-4194-a6cd-c123314635af", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1349569805-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05d148a48e724bbaa4c36f8069f80fbd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1159582e-ad", "ovs_interfaceid": "1159582e-ade2-4ad8-9e4e-f841287d3a51", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 18:21:22 compute-0 nova_compute[183278]: 2026-01-21 18:21:22.609 183284 DEBUG nova.network.os_vif_util [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8a:6e:ce,bridge_name='br-int',has_traffic_filtering=True,id=1159582e-ade2-4ad8-9e4e-f841287d3a51,network=Network(5c0bfae5-9d33-4194-a6cd-c123314635af),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1159582e-ad') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 18:21:22 compute-0 nova_compute[183278]: 2026-01-21 18:21:22.610 183284 DEBUG nova.objects.instance [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Lazy-loading 'pci_devices' on Instance uuid c5a6214c-5527-4115-ad3b-6320d177029b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:21:22 compute-0 podman[206480]: 2026-01-21 18:21:22.991920077 +0000 UTC m=+0.046361004 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 18:21:23 compute-0 podman[206479]: 2026-01-21 18:21:23.042267617 +0000 UTC m=+0.099685166 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 21 18:21:23 compute-0 nova_compute[183278]: 2026-01-21 18:21:23.061 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:21:23 compute-0 nova_compute[183278]: 2026-01-21 18:21:23.084 183284 DEBUG nova.virt.libvirt.driver [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] [instance: c5a6214c-5527-4115-ad3b-6320d177029b] End _get_guest_xml xml=<domain type="kvm">
Jan 21 18:21:23 compute-0 nova_compute[183278]:   <uuid>c5a6214c-5527-4115-ad3b-6320d177029b</uuid>
Jan 21 18:21:23 compute-0 nova_compute[183278]:   <name>instance-0000000a</name>
Jan 21 18:21:23 compute-0 nova_compute[183278]:   <memory>131072</memory>
Jan 21 18:21:23 compute-0 nova_compute[183278]:   <vcpu>1</vcpu>
Jan 21 18:21:23 compute-0 nova_compute[183278]:   <metadata>
Jan 21 18:21:23 compute-0 nova_compute[183278]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 18:21:23 compute-0 nova_compute[183278]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 18:21:23 compute-0 nova_compute[183278]:       <nova:name>tempest-TestExecuteHostMaintenanceStrategy-server-1525730657</nova:name>
Jan 21 18:21:23 compute-0 nova_compute[183278]:       <nova:creationTime>2026-01-21 18:21:22</nova:creationTime>
Jan 21 18:21:23 compute-0 nova_compute[183278]:       <nova:flavor name="m1.nano">
Jan 21 18:21:23 compute-0 nova_compute[183278]:         <nova:memory>128</nova:memory>
Jan 21 18:21:23 compute-0 nova_compute[183278]:         <nova:disk>1</nova:disk>
Jan 21 18:21:23 compute-0 nova_compute[183278]:         <nova:swap>0</nova:swap>
Jan 21 18:21:23 compute-0 nova_compute[183278]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 18:21:23 compute-0 nova_compute[183278]:         <nova:vcpus>1</nova:vcpus>
Jan 21 18:21:23 compute-0 nova_compute[183278]:       </nova:flavor>
Jan 21 18:21:23 compute-0 nova_compute[183278]:       <nova:owner>
Jan 21 18:21:23 compute-0 nova_compute[183278]:         <nova:user uuid="3ab455a0326b442d986277b4d934e2b2">tempest-TestExecuteHostMaintenanceStrategy-1250997236-project-member</nova:user>
Jan 21 18:21:23 compute-0 nova_compute[183278]:         <nova:project uuid="05d148a48e724bbaa4c36f8069f80fbd">tempest-TestExecuteHostMaintenanceStrategy-1250997236</nova:project>
Jan 21 18:21:23 compute-0 nova_compute[183278]:       </nova:owner>
Jan 21 18:21:23 compute-0 nova_compute[183278]:       <nova:root type="image" uuid="672306ae-5521-4fc1-a825-a16d6d125c61"/>
Jan 21 18:21:23 compute-0 nova_compute[183278]:       <nova:ports>
Jan 21 18:21:23 compute-0 nova_compute[183278]:         <nova:port uuid="1159582e-ade2-4ad8-9e4e-f841287d3a51">
Jan 21 18:21:23 compute-0 nova_compute[183278]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 21 18:21:23 compute-0 nova_compute[183278]:         </nova:port>
Jan 21 18:21:23 compute-0 nova_compute[183278]:       </nova:ports>
Jan 21 18:21:23 compute-0 nova_compute[183278]:     </nova:instance>
Jan 21 18:21:23 compute-0 nova_compute[183278]:   </metadata>
Jan 21 18:21:23 compute-0 nova_compute[183278]:   <sysinfo type="smbios">
Jan 21 18:21:23 compute-0 nova_compute[183278]:     <system>
Jan 21 18:21:23 compute-0 nova_compute[183278]:       <entry name="manufacturer">RDO</entry>
Jan 21 18:21:23 compute-0 nova_compute[183278]:       <entry name="product">OpenStack Compute</entry>
Jan 21 18:21:23 compute-0 nova_compute[183278]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 18:21:23 compute-0 nova_compute[183278]:       <entry name="serial">c5a6214c-5527-4115-ad3b-6320d177029b</entry>
Jan 21 18:21:23 compute-0 nova_compute[183278]:       <entry name="uuid">c5a6214c-5527-4115-ad3b-6320d177029b</entry>
Jan 21 18:21:23 compute-0 nova_compute[183278]:       <entry name="family">Virtual Machine</entry>
Jan 21 18:21:23 compute-0 nova_compute[183278]:     </system>
Jan 21 18:21:23 compute-0 nova_compute[183278]:   </sysinfo>
Jan 21 18:21:23 compute-0 nova_compute[183278]:   <os>
Jan 21 18:21:23 compute-0 nova_compute[183278]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 18:21:23 compute-0 nova_compute[183278]:     <boot dev="hd"/>
Jan 21 18:21:23 compute-0 nova_compute[183278]:     <smbios mode="sysinfo"/>
Jan 21 18:21:23 compute-0 nova_compute[183278]:   </os>
Jan 21 18:21:23 compute-0 nova_compute[183278]:   <features>
Jan 21 18:21:23 compute-0 nova_compute[183278]:     <acpi/>
Jan 21 18:21:23 compute-0 nova_compute[183278]:     <apic/>
Jan 21 18:21:23 compute-0 nova_compute[183278]:     <vmcoreinfo/>
Jan 21 18:21:23 compute-0 nova_compute[183278]:   </features>
Jan 21 18:21:23 compute-0 nova_compute[183278]:   <clock offset="utc">
Jan 21 18:21:23 compute-0 nova_compute[183278]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 18:21:23 compute-0 nova_compute[183278]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 18:21:23 compute-0 nova_compute[183278]:     <timer name="hpet" present="no"/>
Jan 21 18:21:23 compute-0 nova_compute[183278]:   </clock>
Jan 21 18:21:23 compute-0 nova_compute[183278]:   <cpu mode="custom" match="exact">
Jan 21 18:21:23 compute-0 nova_compute[183278]:     <model>Nehalem</model>
Jan 21 18:21:23 compute-0 nova_compute[183278]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 18:21:23 compute-0 nova_compute[183278]:   </cpu>
Jan 21 18:21:23 compute-0 nova_compute[183278]:   <devices>
Jan 21 18:21:23 compute-0 nova_compute[183278]:     <disk type="file" device="disk">
Jan 21 18:21:23 compute-0 nova_compute[183278]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 18:21:23 compute-0 nova_compute[183278]:       <source file="/var/lib/nova/instances/c5a6214c-5527-4115-ad3b-6320d177029b/disk"/>
Jan 21 18:21:23 compute-0 nova_compute[183278]:       <target dev="vda" bus="virtio"/>
Jan 21 18:21:23 compute-0 nova_compute[183278]:     </disk>
Jan 21 18:21:23 compute-0 nova_compute[183278]:     <disk type="file" device="cdrom">
Jan 21 18:21:23 compute-0 nova_compute[183278]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 18:21:23 compute-0 nova_compute[183278]:       <source file="/var/lib/nova/instances/c5a6214c-5527-4115-ad3b-6320d177029b/disk.config"/>
Jan 21 18:21:23 compute-0 nova_compute[183278]:       <target dev="sda" bus="sata"/>
Jan 21 18:21:23 compute-0 nova_compute[183278]:     </disk>
Jan 21 18:21:23 compute-0 nova_compute[183278]:     <interface type="ethernet">
Jan 21 18:21:23 compute-0 nova_compute[183278]:       <mac address="fa:16:3e:8a:6e:ce"/>
Jan 21 18:21:23 compute-0 nova_compute[183278]:       <model type="virtio"/>
Jan 21 18:21:23 compute-0 nova_compute[183278]:       <driver name="vhost" rx_queue_size="512"/>
Jan 21 18:21:23 compute-0 nova_compute[183278]:       <mtu size="1442"/>
Jan 21 18:21:23 compute-0 nova_compute[183278]:       <target dev="tap1159582e-ad"/>
Jan 21 18:21:23 compute-0 nova_compute[183278]:     </interface>
Jan 21 18:21:23 compute-0 nova_compute[183278]:     <serial type="pty">
Jan 21 18:21:23 compute-0 nova_compute[183278]:       <log file="/var/lib/nova/instances/c5a6214c-5527-4115-ad3b-6320d177029b/console.log" append="off"/>
Jan 21 18:21:23 compute-0 nova_compute[183278]:     </serial>
Jan 21 18:21:23 compute-0 nova_compute[183278]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 18:21:23 compute-0 nova_compute[183278]:     <video>
Jan 21 18:21:23 compute-0 nova_compute[183278]:       <model type="virtio"/>
Jan 21 18:21:23 compute-0 nova_compute[183278]:     </video>
Jan 21 18:21:23 compute-0 nova_compute[183278]:     <input type="tablet" bus="usb"/>
Jan 21 18:21:23 compute-0 nova_compute[183278]:     <rng model="virtio">
Jan 21 18:21:23 compute-0 nova_compute[183278]:       <backend model="random">/dev/urandom</backend>
Jan 21 18:21:23 compute-0 nova_compute[183278]:     </rng>
Jan 21 18:21:23 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root"/>
Jan 21 18:21:23 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:21:23 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:21:23 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:21:23 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:21:23 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:21:23 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:21:23 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:21:23 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:21:23 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:21:23 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:21:23 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:21:23 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:21:23 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:21:23 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:21:23 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:21:23 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:21:23 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:21:23 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:21:23 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:21:23 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:21:23 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:21:23 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:21:23 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:21:23 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:21:23 compute-0 nova_compute[183278]:     <controller type="usb" index="0"/>
Jan 21 18:21:23 compute-0 nova_compute[183278]:     <memballoon model="virtio">
Jan 21 18:21:23 compute-0 nova_compute[183278]:       <stats period="10"/>
Jan 21 18:21:23 compute-0 nova_compute[183278]:     </memballoon>
Jan 21 18:21:23 compute-0 nova_compute[183278]:   </devices>
Jan 21 18:21:23 compute-0 nova_compute[183278]: </domain>
Jan 21 18:21:23 compute-0 nova_compute[183278]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 18:21:23 compute-0 nova_compute[183278]: 2026-01-21 18:21:23.084 183284 DEBUG nova.compute.manager [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] [instance: c5a6214c-5527-4115-ad3b-6320d177029b] Preparing to wait for external event network-vif-plugged-1159582e-ade2-4ad8-9e4e-f841287d3a51 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 21 18:21:23 compute-0 nova_compute[183278]: 2026-01-21 18:21:23.085 183284 DEBUG oslo_concurrency.lockutils [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Acquiring lock "c5a6214c-5527-4115-ad3b-6320d177029b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:21:23 compute-0 nova_compute[183278]: 2026-01-21 18:21:23.085 183284 DEBUG oslo_concurrency.lockutils [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Lock "c5a6214c-5527-4115-ad3b-6320d177029b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:21:23 compute-0 nova_compute[183278]: 2026-01-21 18:21:23.085 183284 DEBUG oslo_concurrency.lockutils [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Lock "c5a6214c-5527-4115-ad3b-6320d177029b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:21:23 compute-0 nova_compute[183278]: 2026-01-21 18:21:23.086 183284 DEBUG nova.virt.libvirt.vif [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T18:21:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1525730657',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1525730657',id=10,image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='05d148a48e724bbaa4c36f8069f80fbd',ramdisk_id='',reservation_id='r-wgas9oiz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1250997236',owner_user_name='tempest-TestExecuteHostMai
ntenanceStrategy-1250997236-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T18:21:14Z,user_data=None,user_id='3ab455a0326b442d986277b4d934e2b2',uuid=c5a6214c-5527-4115-ad3b-6320d177029b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1159582e-ade2-4ad8-9e4e-f841287d3a51", "address": "fa:16:3e:8a:6e:ce", "network": {"id": "5c0bfae5-9d33-4194-a6cd-c123314635af", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1349569805-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05d148a48e724bbaa4c36f8069f80fbd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1159582e-ad", "ovs_interfaceid": "1159582e-ade2-4ad8-9e4e-f841287d3a51", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 21 18:21:23 compute-0 nova_compute[183278]: 2026-01-21 18:21:23.086 183284 DEBUG nova.network.os_vif_util [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Converting VIF {"id": "1159582e-ade2-4ad8-9e4e-f841287d3a51", "address": "fa:16:3e:8a:6e:ce", "network": {"id": "5c0bfae5-9d33-4194-a6cd-c123314635af", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1349569805-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05d148a48e724bbaa4c36f8069f80fbd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1159582e-ad", "ovs_interfaceid": "1159582e-ade2-4ad8-9e4e-f841287d3a51", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 18:21:23 compute-0 nova_compute[183278]: 2026-01-21 18:21:23.087 183284 DEBUG nova.network.os_vif_util [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8a:6e:ce,bridge_name='br-int',has_traffic_filtering=True,id=1159582e-ade2-4ad8-9e4e-f841287d3a51,network=Network(5c0bfae5-9d33-4194-a6cd-c123314635af),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1159582e-ad') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 18:21:23 compute-0 nova_compute[183278]: 2026-01-21 18:21:23.087 183284 DEBUG os_vif [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8a:6e:ce,bridge_name='br-int',has_traffic_filtering=True,id=1159582e-ade2-4ad8-9e4e-f841287d3a51,network=Network(5c0bfae5-9d33-4194-a6cd-c123314635af),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1159582e-ad') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 21 18:21:23 compute-0 nova_compute[183278]: 2026-01-21 18:21:23.088 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:21:23 compute-0 nova_compute[183278]: 2026-01-21 18:21:23.088 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:21:23 compute-0 nova_compute[183278]: 2026-01-21 18:21:23.088 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 18:21:23 compute-0 nova_compute[183278]: 2026-01-21 18:21:23.090 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:21:23 compute-0 nova_compute[183278]: 2026-01-21 18:21:23.091 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1159582e-ad, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:21:23 compute-0 nova_compute[183278]: 2026-01-21 18:21:23.091 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1159582e-ad, col_values=(('external_ids', {'iface-id': '1159582e-ade2-4ad8-9e4e-f841287d3a51', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8a:6e:ce', 'vm-uuid': 'c5a6214c-5527-4115-ad3b-6320d177029b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:21:23 compute-0 nova_compute[183278]: 2026-01-21 18:21:23.092 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:21:23 compute-0 NetworkManager[55506]: <info>  [1769019683.0933] manager: (tap1159582e-ad): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/36)
Jan 21 18:21:23 compute-0 nova_compute[183278]: 2026-01-21 18:21:23.095 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 18:21:23 compute-0 nova_compute[183278]: 2026-01-21 18:21:23.099 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:21:23 compute-0 nova_compute[183278]: 2026-01-21 18:21:23.100 183284 INFO os_vif [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8a:6e:ce,bridge_name='br-int',has_traffic_filtering=True,id=1159582e-ade2-4ad8-9e4e-f841287d3a51,network=Network(5c0bfae5-9d33-4194-a6cd-c123314635af),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1159582e-ad')
Jan 21 18:21:23 compute-0 nova_compute[183278]: 2026-01-21 18:21:23.200 183284 DEBUG nova.virt.libvirt.driver [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 18:21:23 compute-0 nova_compute[183278]: 2026-01-21 18:21:23.200 183284 DEBUG nova.virt.libvirt.driver [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 18:21:23 compute-0 nova_compute[183278]: 2026-01-21 18:21:23.201 183284 DEBUG nova.virt.libvirt.driver [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] No VIF found with MAC fa:16:3e:8a:6e:ce, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 21 18:21:23 compute-0 nova_compute[183278]: 2026-01-21 18:21:23.201 183284 INFO nova.virt.libvirt.driver [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] [instance: c5a6214c-5527-4115-ad3b-6320d177029b] Using config drive
Jan 21 18:21:23 compute-0 nova_compute[183278]: 2026-01-21 18:21:23.569 183284 INFO nova.virt.libvirt.driver [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] [instance: c5a6214c-5527-4115-ad3b-6320d177029b] Creating config drive at /var/lib/nova/instances/c5a6214c-5527-4115-ad3b-6320d177029b/disk.config
Jan 21 18:21:23 compute-0 nova_compute[183278]: 2026-01-21 18:21:23.573 183284 DEBUG oslo_concurrency.processutils [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c5a6214c-5527-4115-ad3b-6320d177029b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpbfw_ws9k execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:21:23 compute-0 nova_compute[183278]: 2026-01-21 18:21:23.694 183284 DEBUG oslo_concurrency.processutils [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c5a6214c-5527-4115-ad3b-6320d177029b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpbfw_ws9k" returned: 0 in 0.121s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:21:23 compute-0 kernel: tap1159582e-ad: entered promiscuous mode
Jan 21 18:21:23 compute-0 ovn_controller[95419]: 2026-01-21T18:21:23Z|00071|binding|INFO|Claiming lport 1159582e-ade2-4ad8-9e4e-f841287d3a51 for this chassis.
Jan 21 18:21:23 compute-0 ovn_controller[95419]: 2026-01-21T18:21:23Z|00072|binding|INFO|1159582e-ade2-4ad8-9e4e-f841287d3a51: Claiming fa:16:3e:8a:6e:ce 10.100.0.12
Jan 21 18:21:23 compute-0 NetworkManager[55506]: <info>  [1769019683.7586] manager: (tap1159582e-ad): new Tun device (/org/freedesktop/NetworkManager/Devices/37)
Jan 21 18:21:23 compute-0 nova_compute[183278]: 2026-01-21 18:21:23.758 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:21:23 compute-0 nova_compute[183278]: 2026-01-21 18:21:23.761 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:21:23 compute-0 nova_compute[183278]: 2026-01-21 18:21:23.764 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:21:23 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:21:23.775 104698 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8a:6e:ce 10.100.0.12'], port_security=['fa:16:3e:8a:6e:ce 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'c5a6214c-5527-4115-ad3b-6320d177029b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5c0bfae5-9d33-4194-a6cd-c123314635af', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '05d148a48e724bbaa4c36f8069f80fbd', 'neutron:revision_number': '2', 'neutron:security_group_ids': '20b590e1-7eda-4a33-8e20-e2c79963b0dc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a825335a-1d7b-43d5-8e75-4088b4c18dcd, chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>], logical_port=1159582e-ade2-4ad8-9e4e-f841287d3a51) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 18:21:23 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:21:23.777 104698 INFO neutron.agent.ovn.metadata.agent [-] Port 1159582e-ade2-4ad8-9e4e-f841287d3a51 in datapath 5c0bfae5-9d33-4194-a6cd-c123314635af bound to our chassis
Jan 21 18:21:23 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:21:23.778 104698 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 5c0bfae5-9d33-4194-a6cd-c123314635af
Jan 21 18:21:23 compute-0 systemd-udevd[206539]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 18:21:23 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:21:23.789 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[20ee49fe-9cc6-4a3f-a1ef-c6f620c85c62]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:21:23 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:21:23.790 104698 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap5c0bfae5-91 in ovnmeta-5c0bfae5-9d33-4194-a6cd-c123314635af namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 21 18:21:23 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:21:23.792 203892 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap5c0bfae5-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 21 18:21:23 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:21:23.792 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[cc31c42a-8425-4522-8c42-0bf9fc48d547]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:21:23 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:21:23.793 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[030ec64b-4cfa-43a2-8de6-47bdd7cbf928]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:21:23 compute-0 NetworkManager[55506]: <info>  [1769019683.7977] device (tap1159582e-ad): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 18:21:23 compute-0 NetworkManager[55506]: <info>  [1769019683.7982] device (tap1159582e-ad): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 18:21:23 compute-0 systemd-machined[154592]: New machine qemu-6-instance-0000000a.
Jan 21 18:21:23 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:21:23.804 105036 DEBUG oslo.privsep.daemon [-] privsep: reply[945ba2d6-4bff-48fd-afa7-c2f9055db771]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:21:23 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:21:23.841 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[2bb6f9c6-d3d9-40b2-a67a-8460acbe1d63]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:21:23 compute-0 systemd[1]: Started Virtual Machine qemu-6-instance-0000000a.
Jan 21 18:21:23 compute-0 nova_compute[183278]: 2026-01-21 18:21:23.846 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:21:23 compute-0 ovn_controller[95419]: 2026-01-21T18:21:23Z|00073|binding|INFO|Setting lport 1159582e-ade2-4ad8-9e4e-f841287d3a51 ovn-installed in OVS
Jan 21 18:21:23 compute-0 ovn_controller[95419]: 2026-01-21T18:21:23Z|00074|binding|INFO|Setting lport 1159582e-ade2-4ad8-9e4e-f841287d3a51 up in Southbound
Jan 21 18:21:23 compute-0 nova_compute[183278]: 2026-01-21 18:21:23.850 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:21:23 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:21:23.870 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[37c7c3fb-62eb-4559-b69e-0d23ef078dbb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:21:23 compute-0 NetworkManager[55506]: <info>  [1769019683.8764] manager: (tap5c0bfae5-90): new Veth device (/org/freedesktop/NetworkManager/Devices/38)
Jan 21 18:21:23 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:21:23.876 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[7dcd404a-91a3-4c24-a447-041aa70ce6ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:21:23 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:21:23.902 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[b5912b77-6926-4746-9d12-6635a0f5ecf9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:21:23 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:21:23.905 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[ce983b21-4fb3-4127-82df-12834a10b84f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:21:23 compute-0 NetworkManager[55506]: <info>  [1769019683.9228] device (tap5c0bfae5-90): carrier: link connected
Jan 21 18:21:23 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:21:23.926 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[ae0b3e42-e82d-4341-8910-6ada755c0b7f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:21:23 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:21:23.943 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[f33e5de0-7636-4a18-8301-e9c6f88f1e91]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5c0bfae5-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:73:69:dd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 25], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 424753, 'reachable_time': 21954, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 206573, 'error': None, 'target': 'ovnmeta-5c0bfae5-9d33-4194-a6cd-c123314635af', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:21:23 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:21:23.957 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[b61cdb16-d47d-4674-b913-0c6f5bff0b4d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe73:69dd'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 424753, 'tstamp': 424753}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 206574, 'error': None, 'target': 'ovnmeta-5c0bfae5-9d33-4194-a6cd-c123314635af', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:21:23 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:21:23.972 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[0a21f7b9-1275-45c9-8590-0bc9c93ea53d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5c0bfae5-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:73:69:dd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 25], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 424753, 'reachable_time': 21954, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 206575, 'error': None, 'target': 'ovnmeta-5c0bfae5-9d33-4194-a6cd-c123314635af', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:21:24 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:21:24.003 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[049079f4-1971-4bfd-939a-976a076e938c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:21:24 compute-0 nova_compute[183278]: 2026-01-21 18:21:24.017 183284 DEBUG nova.compute.manager [req-b6a5f595-7b09-4267-bb5a-8682ed6dbe55 req-51c0e87e-80bf-4c63-b528-94eac06216ab 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c5a6214c-5527-4115-ad3b-6320d177029b] Received event network-vif-plugged-1159582e-ade2-4ad8-9e4e-f841287d3a51 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:21:24 compute-0 nova_compute[183278]: 2026-01-21 18:21:24.018 183284 DEBUG oslo_concurrency.lockutils [req-b6a5f595-7b09-4267-bb5a-8682ed6dbe55 req-51c0e87e-80bf-4c63-b528-94eac06216ab 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "c5a6214c-5527-4115-ad3b-6320d177029b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:21:24 compute-0 nova_compute[183278]: 2026-01-21 18:21:24.018 183284 DEBUG oslo_concurrency.lockutils [req-b6a5f595-7b09-4267-bb5a-8682ed6dbe55 req-51c0e87e-80bf-4c63-b528-94eac06216ab 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "c5a6214c-5527-4115-ad3b-6320d177029b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:21:24 compute-0 nova_compute[183278]: 2026-01-21 18:21:24.018 183284 DEBUG oslo_concurrency.lockutils [req-b6a5f595-7b09-4267-bb5a-8682ed6dbe55 req-51c0e87e-80bf-4c63-b528-94eac06216ab 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "c5a6214c-5527-4115-ad3b-6320d177029b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:21:24 compute-0 nova_compute[183278]: 2026-01-21 18:21:24.018 183284 DEBUG nova.compute.manager [req-b6a5f595-7b09-4267-bb5a-8682ed6dbe55 req-51c0e87e-80bf-4c63-b528-94eac06216ab 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c5a6214c-5527-4115-ad3b-6320d177029b] Processing event network-vif-plugged-1159582e-ade2-4ad8-9e4e-f841287d3a51 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 21 18:21:24 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:21:24.060 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[51910878-9c58-41ea-8e8e-a5af757eefa4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:21:24 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:21:24.062 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5c0bfae5-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:21:24 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:21:24.062 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 18:21:24 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:21:24.062 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5c0bfae5-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:21:24 compute-0 nova_compute[183278]: 2026-01-21 18:21:24.064 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:21:24 compute-0 NetworkManager[55506]: <info>  [1769019684.0651] manager: (tap5c0bfae5-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/39)
Jan 21 18:21:24 compute-0 kernel: tap5c0bfae5-90: entered promiscuous mode
Jan 21 18:21:24 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:21:24.069 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap5c0bfae5-90, col_values=(('external_ids', {'iface-id': 'ae78296c-7244-4dd2-9b62-f5f86b1d8165'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:21:24 compute-0 ovn_controller[95419]: 2026-01-21T18:21:24Z|00075|binding|INFO|Releasing lport ae78296c-7244-4dd2-9b62-f5f86b1d8165 from this chassis (sb_readonly=0)
Jan 21 18:21:24 compute-0 nova_compute[183278]: 2026-01-21 18:21:24.070 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:21:24 compute-0 nova_compute[183278]: 2026-01-21 18:21:24.070 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:21:24 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:21:24.072 104698 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/5c0bfae5-9d33-4194-a6cd-c123314635af.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/5c0bfae5-9d33-4194-a6cd-c123314635af.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 21 18:21:24 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:21:24.072 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[e22d39f5-8c26-46f1-870f-4be809eb8c9c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:21:24 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:21:24.073 104698 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 18:21:24 compute-0 ovn_metadata_agent[104693]: global
Jan 21 18:21:24 compute-0 ovn_metadata_agent[104693]:     log         /dev/log local0 debug
Jan 21 18:21:24 compute-0 ovn_metadata_agent[104693]:     log-tag     haproxy-metadata-proxy-5c0bfae5-9d33-4194-a6cd-c123314635af
Jan 21 18:21:24 compute-0 ovn_metadata_agent[104693]:     user        root
Jan 21 18:21:24 compute-0 ovn_metadata_agent[104693]:     group       root
Jan 21 18:21:24 compute-0 ovn_metadata_agent[104693]:     maxconn     1024
Jan 21 18:21:24 compute-0 ovn_metadata_agent[104693]:     pidfile     /var/lib/neutron/external/pids/5c0bfae5-9d33-4194-a6cd-c123314635af.pid.haproxy
Jan 21 18:21:24 compute-0 ovn_metadata_agent[104693]:     daemon
Jan 21 18:21:24 compute-0 ovn_metadata_agent[104693]: 
Jan 21 18:21:24 compute-0 ovn_metadata_agent[104693]: defaults
Jan 21 18:21:24 compute-0 ovn_metadata_agent[104693]:     log global
Jan 21 18:21:24 compute-0 ovn_metadata_agent[104693]:     mode http
Jan 21 18:21:24 compute-0 ovn_metadata_agent[104693]:     option httplog
Jan 21 18:21:24 compute-0 ovn_metadata_agent[104693]:     option dontlognull
Jan 21 18:21:24 compute-0 ovn_metadata_agent[104693]:     option http-server-close
Jan 21 18:21:24 compute-0 ovn_metadata_agent[104693]:     option forwardfor
Jan 21 18:21:24 compute-0 ovn_metadata_agent[104693]:     retries                 3
Jan 21 18:21:24 compute-0 ovn_metadata_agent[104693]:     timeout http-request    30s
Jan 21 18:21:24 compute-0 ovn_metadata_agent[104693]:     timeout connect         30s
Jan 21 18:21:24 compute-0 ovn_metadata_agent[104693]:     timeout client          32s
Jan 21 18:21:24 compute-0 ovn_metadata_agent[104693]:     timeout server          32s
Jan 21 18:21:24 compute-0 ovn_metadata_agent[104693]:     timeout http-keep-alive 30s
Jan 21 18:21:24 compute-0 ovn_metadata_agent[104693]: 
Jan 21 18:21:24 compute-0 ovn_metadata_agent[104693]: 
Jan 21 18:21:24 compute-0 ovn_metadata_agent[104693]: listen listener
Jan 21 18:21:24 compute-0 ovn_metadata_agent[104693]:     bind 169.254.169.254:80
Jan 21 18:21:24 compute-0 ovn_metadata_agent[104693]:     server metadata /var/lib/neutron/metadata_proxy
Jan 21 18:21:24 compute-0 ovn_metadata_agent[104693]:     http-request add-header X-OVN-Network-ID 5c0bfae5-9d33-4194-a6cd-c123314635af
Jan 21 18:21:24 compute-0 ovn_metadata_agent[104693]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 21 18:21:24 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:21:24.074 104698 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-5c0bfae5-9d33-4194-a6cd-c123314635af', 'env', 'PROCESS_TAG=haproxy-5c0bfae5-9d33-4194-a6cd-c123314635af', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/5c0bfae5-9d33-4194-a6cd-c123314635af.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 21 18:21:24 compute-0 nova_compute[183278]: 2026-01-21 18:21:24.081 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:21:24 compute-0 nova_compute[183278]: 2026-01-21 18:21:24.125 183284 DEBUG nova.compute.manager [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] [instance: c5a6214c-5527-4115-ad3b-6320d177029b] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 18:21:24 compute-0 nova_compute[183278]: 2026-01-21 18:21:24.126 183284 DEBUG nova.virt.driver [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Emitting event <LifecycleEvent: 1769019684.1249485, c5a6214c-5527-4115-ad3b-6320d177029b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:21:24 compute-0 nova_compute[183278]: 2026-01-21 18:21:24.126 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: c5a6214c-5527-4115-ad3b-6320d177029b] VM Started (Lifecycle Event)
Jan 21 18:21:24 compute-0 nova_compute[183278]: 2026-01-21 18:21:24.132 183284 DEBUG nova.virt.libvirt.driver [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] [instance: c5a6214c-5527-4115-ad3b-6320d177029b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 18:21:24 compute-0 nova_compute[183278]: 2026-01-21 18:21:24.135 183284 INFO nova.virt.libvirt.driver [-] [instance: c5a6214c-5527-4115-ad3b-6320d177029b] Instance spawned successfully.
Jan 21 18:21:24 compute-0 nova_compute[183278]: 2026-01-21 18:21:24.135 183284 DEBUG nova.virt.libvirt.driver [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] [instance: c5a6214c-5527-4115-ad3b-6320d177029b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 21 18:21:24 compute-0 nova_compute[183278]: 2026-01-21 18:21:24.145 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: c5a6214c-5527-4115-ad3b-6320d177029b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:21:24 compute-0 nova_compute[183278]: 2026-01-21 18:21:24.148 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: c5a6214c-5527-4115-ad3b-6320d177029b] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 18:21:24 compute-0 nova_compute[183278]: 2026-01-21 18:21:24.155 183284 DEBUG nova.virt.libvirt.driver [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] [instance: c5a6214c-5527-4115-ad3b-6320d177029b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:21:24 compute-0 nova_compute[183278]: 2026-01-21 18:21:24.155 183284 DEBUG nova.virt.libvirt.driver [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] [instance: c5a6214c-5527-4115-ad3b-6320d177029b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:21:24 compute-0 nova_compute[183278]: 2026-01-21 18:21:24.156 183284 DEBUG nova.virt.libvirt.driver [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] [instance: c5a6214c-5527-4115-ad3b-6320d177029b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:21:24 compute-0 nova_compute[183278]: 2026-01-21 18:21:24.156 183284 DEBUG nova.virt.libvirt.driver [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] [instance: c5a6214c-5527-4115-ad3b-6320d177029b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:21:24 compute-0 nova_compute[183278]: 2026-01-21 18:21:24.157 183284 DEBUG nova.virt.libvirt.driver [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] [instance: c5a6214c-5527-4115-ad3b-6320d177029b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:21:24 compute-0 nova_compute[183278]: 2026-01-21 18:21:24.158 183284 DEBUG nova.virt.libvirt.driver [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] [instance: c5a6214c-5527-4115-ad3b-6320d177029b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:21:24 compute-0 nova_compute[183278]: 2026-01-21 18:21:24.165 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: c5a6214c-5527-4115-ad3b-6320d177029b] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 18:21:24 compute-0 nova_compute[183278]: 2026-01-21 18:21:24.166 183284 DEBUG nova.virt.driver [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Emitting event <LifecycleEvent: 1769019684.125967, c5a6214c-5527-4115-ad3b-6320d177029b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:21:24 compute-0 nova_compute[183278]: 2026-01-21 18:21:24.166 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: c5a6214c-5527-4115-ad3b-6320d177029b] VM Paused (Lifecycle Event)
Jan 21 18:21:24 compute-0 nova_compute[183278]: 2026-01-21 18:21:24.196 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: c5a6214c-5527-4115-ad3b-6320d177029b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:21:24 compute-0 nova_compute[183278]: 2026-01-21 18:21:24.200 183284 DEBUG nova.virt.driver [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Emitting event <LifecycleEvent: 1769019684.1314912, c5a6214c-5527-4115-ad3b-6320d177029b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:21:24 compute-0 nova_compute[183278]: 2026-01-21 18:21:24.200 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: c5a6214c-5527-4115-ad3b-6320d177029b] VM Resumed (Lifecycle Event)
Jan 21 18:21:24 compute-0 nova_compute[183278]: 2026-01-21 18:21:24.219 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: c5a6214c-5527-4115-ad3b-6320d177029b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:21:24 compute-0 nova_compute[183278]: 2026-01-21 18:21:24.222 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: c5a6214c-5527-4115-ad3b-6320d177029b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 18:21:24 compute-0 nova_compute[183278]: 2026-01-21 18:21:24.226 183284 INFO nova.compute.manager [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] [instance: c5a6214c-5527-4115-ad3b-6320d177029b] Took 9.43 seconds to spawn the instance on the hypervisor.
Jan 21 18:21:24 compute-0 nova_compute[183278]: 2026-01-21 18:21:24.227 183284 DEBUG nova.compute.manager [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] [instance: c5a6214c-5527-4115-ad3b-6320d177029b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:21:24 compute-0 nova_compute[183278]: 2026-01-21 18:21:24.250 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: c5a6214c-5527-4115-ad3b-6320d177029b] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 18:21:24 compute-0 nova_compute[183278]: 2026-01-21 18:21:24.280 183284 INFO nova.compute.manager [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] [instance: c5a6214c-5527-4115-ad3b-6320d177029b] Took 9.90 seconds to build instance.
Jan 21 18:21:24 compute-0 nova_compute[183278]: 2026-01-21 18:21:24.300 183284 DEBUG oslo_concurrency.lockutils [None req-012a76e9-b423-4db0-b2ac-1b9be0d394ba 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Lock "c5a6214c-5527-4115-ad3b-6320d177029b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.377s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:21:24 compute-0 podman[206613]: 2026-01-21 18:21:24.434022526 +0000 UTC m=+0.048721172 container create f8d4fe423e568f35a08872a5c6b419d2f81a002c74c7600a716264705f05ad91 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5c0bfae5-9d33-4194-a6cd-c123314635af, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 21 18:21:24 compute-0 systemd[1]: Started libpod-conmon-f8d4fe423e568f35a08872a5c6b419d2f81a002c74c7600a716264705f05ad91.scope.
Jan 21 18:21:24 compute-0 systemd[1]: Started libcrun container.
Jan 21 18:21:24 compute-0 podman[206613]: 2026-01-21 18:21:24.406616132 +0000 UTC m=+0.021314798 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 18:21:24 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9e3ac9cd58a11e31de43280e4c8b57708da97f81d0770ac0f892bf4fe77b9dd6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 18:21:24 compute-0 podman[206613]: 2026-01-21 18:21:24.519529897 +0000 UTC m=+0.134228543 container init f8d4fe423e568f35a08872a5c6b419d2f81a002c74c7600a716264705f05ad91 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5c0bfae5-9d33-4194-a6cd-c123314635af, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 21 18:21:24 compute-0 podman[206613]: 2026-01-21 18:21:24.525639676 +0000 UTC m=+0.140338322 container start f8d4fe423e568f35a08872a5c6b419d2f81a002c74c7600a716264705f05ad91 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5c0bfae5-9d33-4194-a6cd-c123314635af, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 21 18:21:24 compute-0 neutron-haproxy-ovnmeta-5c0bfae5-9d33-4194-a6cd-c123314635af[206630]: [NOTICE]   (206634) : New worker (206636) forked
Jan 21 18:21:24 compute-0 neutron-haproxy-ovnmeta-5c0bfae5-9d33-4194-a6cd-c123314635af[206630]: [NOTICE]   (206634) : Loading success.
Jan 21 18:21:24 compute-0 nova_compute[183278]: 2026-01-21 18:21:24.633 183284 DEBUG nova.network.neutron [req-4a3109f1-6bc6-4bd7-8be4-06900bb74446 req-cfd63aa8-405e-47b7-ba41-9b9c63088f01 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c5a6214c-5527-4115-ad3b-6320d177029b] Updated VIF entry in instance network info cache for port 1159582e-ade2-4ad8-9e4e-f841287d3a51. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 18:21:24 compute-0 nova_compute[183278]: 2026-01-21 18:21:24.634 183284 DEBUG nova.network.neutron [req-4a3109f1-6bc6-4bd7-8be4-06900bb74446 req-cfd63aa8-405e-47b7-ba41-9b9c63088f01 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c5a6214c-5527-4115-ad3b-6320d177029b] Updating instance_info_cache with network_info: [{"id": "1159582e-ade2-4ad8-9e4e-f841287d3a51", "address": "fa:16:3e:8a:6e:ce", "network": {"id": "5c0bfae5-9d33-4194-a6cd-c123314635af", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1349569805-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05d148a48e724bbaa4c36f8069f80fbd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1159582e-ad", "ovs_interfaceid": "1159582e-ade2-4ad8-9e4e-f841287d3a51", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:21:24 compute-0 nova_compute[183278]: 2026-01-21 18:21:24.648 183284 DEBUG oslo_concurrency.lockutils [req-4a3109f1-6bc6-4bd7-8be4-06900bb74446 req-cfd63aa8-405e-47b7-ba41-9b9c63088f01 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Releasing lock "refresh_cache-c5a6214c-5527-4115-ad3b-6320d177029b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 18:21:24 compute-0 nova_compute[183278]: 2026-01-21 18:21:24.948 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:21:26 compute-0 nova_compute[183278]: 2026-01-21 18:21:26.106 183284 DEBUG nova.compute.manager [req-309566c4-1d36-40ca-8f16-ed754ce6263f req-3883d449-d238-40b6-93ca-97cdeff8efb7 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c5a6214c-5527-4115-ad3b-6320d177029b] Received event network-vif-plugged-1159582e-ade2-4ad8-9e4e-f841287d3a51 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:21:26 compute-0 nova_compute[183278]: 2026-01-21 18:21:26.107 183284 DEBUG oslo_concurrency.lockutils [req-309566c4-1d36-40ca-8f16-ed754ce6263f req-3883d449-d238-40b6-93ca-97cdeff8efb7 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "c5a6214c-5527-4115-ad3b-6320d177029b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:21:26 compute-0 nova_compute[183278]: 2026-01-21 18:21:26.108 183284 DEBUG oslo_concurrency.lockutils [req-309566c4-1d36-40ca-8f16-ed754ce6263f req-3883d449-d238-40b6-93ca-97cdeff8efb7 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "c5a6214c-5527-4115-ad3b-6320d177029b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:21:26 compute-0 nova_compute[183278]: 2026-01-21 18:21:26.108 183284 DEBUG oslo_concurrency.lockutils [req-309566c4-1d36-40ca-8f16-ed754ce6263f req-3883d449-d238-40b6-93ca-97cdeff8efb7 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "c5a6214c-5527-4115-ad3b-6320d177029b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:21:26 compute-0 nova_compute[183278]: 2026-01-21 18:21:26.108 183284 DEBUG nova.compute.manager [req-309566c4-1d36-40ca-8f16-ed754ce6263f req-3883d449-d238-40b6-93ca-97cdeff8efb7 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c5a6214c-5527-4115-ad3b-6320d177029b] No waiting events found dispatching network-vif-plugged-1159582e-ade2-4ad8-9e4e-f841287d3a51 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:21:26 compute-0 nova_compute[183278]: 2026-01-21 18:21:26.109 183284 WARNING nova.compute.manager [req-309566c4-1d36-40ca-8f16-ed754ce6263f req-3883d449-d238-40b6-93ca-97cdeff8efb7 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c5a6214c-5527-4115-ad3b-6320d177029b] Received unexpected event network-vif-plugged-1159582e-ade2-4ad8-9e4e-f841287d3a51 for instance with vm_state active and task_state None.
Jan 21 18:21:27 compute-0 podman[206645]: 2026-01-21 18:21:27.998440452 +0000 UTC m=+0.052023251 container health_status 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 21 18:21:28 compute-0 nova_compute[183278]: 2026-01-21 18:21:28.093 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:21:29 compute-0 podman[192560]: time="2026-01-21T18:21:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 18:21:29 compute-0 podman[192560]: @ - - [21/Jan/2026:18:21:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16587 "" "Go-http-client/1.1"
Jan 21 18:21:29 compute-0 podman[192560]: @ - - [21/Jan/2026:18:21:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2637 "" "Go-http-client/1.1"
Jan 21 18:21:29 compute-0 nova_compute[183278]: 2026-01-21 18:21:29.950 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:21:31 compute-0 openstack_network_exporter[195402]: ERROR   18:21:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 21 18:21:31 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:21:31 compute-0 openstack_network_exporter[195402]: ERROR   18:21:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 21 18:21:31 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:21:33 compute-0 nova_compute[183278]: 2026-01-21 18:21:33.095 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:21:34 compute-0 nova_compute[183278]: 2026-01-21 18:21:34.952 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:21:35 compute-0 nova_compute[183278]: 2026-01-21 18:21:35.563 183284 DEBUG nova.virt.libvirt.driver [None req-c9553c95-9e0e-487f-914f-95188291e1ca 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ae4bd600-2611-4780-b4f9-571296621dee] Creating tmpfile /var/lib/nova/instances/tmpc5g9tr9v to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Jan 21 18:21:35 compute-0 nova_compute[183278]: 2026-01-21 18:21:35.676 183284 DEBUG nova.compute.manager [None req-c9553c95-9e0e-487f-914f-95188291e1ca 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpc5g9tr9v',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Jan 21 18:21:36 compute-0 ovn_controller[95419]: 2026-01-21T18:21:36Z|00014|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:8a:6e:ce 10.100.0.12
Jan 21 18:21:36 compute-0 ovn_controller[95419]: 2026-01-21T18:21:36Z|00015|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:8a:6e:ce 10.100.0.12
Jan 21 18:21:38 compute-0 nova_compute[183278]: 2026-01-21 18:21:38.097 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:21:38 compute-0 nova_compute[183278]: 2026-01-21 18:21:38.725 183284 DEBUG nova.compute.manager [None req-c9553c95-9e0e-487f-914f-95188291e1ca 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpc5g9tr9v',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='ae4bd600-2611-4780-b4f9-571296621dee',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Jan 21 18:21:38 compute-0 nova_compute[183278]: 2026-01-21 18:21:38.752 183284 DEBUG oslo_concurrency.lockutils [None req-c9553c95-9e0e-487f-914f-95188291e1ca 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "refresh_cache-ae4bd600-2611-4780-b4f9-571296621dee" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:21:38 compute-0 nova_compute[183278]: 2026-01-21 18:21:38.752 183284 DEBUG oslo_concurrency.lockutils [None req-c9553c95-9e0e-487f-914f-95188291e1ca 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Acquired lock "refresh_cache-ae4bd600-2611-4780-b4f9-571296621dee" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 18:21:38 compute-0 nova_compute[183278]: 2026-01-21 18:21:38.753 183284 DEBUG nova.network.neutron [None req-c9553c95-9e0e-487f-914f-95188291e1ca 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ae4bd600-2611-4780-b4f9-571296621dee] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 18:21:39 compute-0 nova_compute[183278]: 2026-01-21 18:21:39.954 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:21:41 compute-0 nova_compute[183278]: 2026-01-21 18:21:41.979 183284 DEBUG nova.network.neutron [None req-c9553c95-9e0e-487f-914f-95188291e1ca 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ae4bd600-2611-4780-b4f9-571296621dee] Updating instance_info_cache with network_info: [{"id": "2211746e-0564-4c99-81c5-664846dc9eb4", "address": "fa:16:3e:54:8f:7c", "network": {"id": "5c0bfae5-9d33-4194-a6cd-c123314635af", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1349569805-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05d148a48e724bbaa4c36f8069f80fbd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2211746e-05", "ovs_interfaceid": "2211746e-0564-4c99-81c5-664846dc9eb4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:21:42 compute-0 nova_compute[183278]: 2026-01-21 18:21:42.000 183284 DEBUG oslo_concurrency.lockutils [None req-c9553c95-9e0e-487f-914f-95188291e1ca 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Releasing lock "refresh_cache-ae4bd600-2611-4780-b4f9-571296621dee" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 18:21:42 compute-0 nova_compute[183278]: 2026-01-21 18:21:42.003 183284 DEBUG nova.virt.libvirt.driver [None req-c9553c95-9e0e-487f-914f-95188291e1ca 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ae4bd600-2611-4780-b4f9-571296621dee] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpc5g9tr9v',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='ae4bd600-2611-4780-b4f9-571296621dee',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Jan 21 18:21:42 compute-0 nova_compute[183278]: 2026-01-21 18:21:42.004 183284 DEBUG nova.virt.libvirt.driver [None req-c9553c95-9e0e-487f-914f-95188291e1ca 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ae4bd600-2611-4780-b4f9-571296621dee] Creating instance directory: /var/lib/nova/instances/ae4bd600-2611-4780-b4f9-571296621dee pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Jan 21 18:21:42 compute-0 nova_compute[183278]: 2026-01-21 18:21:42.004 183284 DEBUG nova.virt.libvirt.driver [None req-c9553c95-9e0e-487f-914f-95188291e1ca 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ae4bd600-2611-4780-b4f9-571296621dee] Creating disk.info with the contents: {'/var/lib/nova/instances/ae4bd600-2611-4780-b4f9-571296621dee/disk': 'qcow2', '/var/lib/nova/instances/ae4bd600-2611-4780-b4f9-571296621dee/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854
Jan 21 18:21:42 compute-0 nova_compute[183278]: 2026-01-21 18:21:42.005 183284 DEBUG nova.virt.libvirt.driver [None req-c9553c95-9e0e-487f-914f-95188291e1ca 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ae4bd600-2611-4780-b4f9-571296621dee] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864
Jan 21 18:21:42 compute-0 nova_compute[183278]: 2026-01-21 18:21:42.005 183284 DEBUG nova.objects.instance [None req-c9553c95-9e0e-487f-914f-95188291e1ca 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lazy-loading 'trusted_certs' on Instance uuid ae4bd600-2611-4780-b4f9-571296621dee obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:21:42 compute-0 nova_compute[183278]: 2026-01-21 18:21:42.033 183284 DEBUG oslo_concurrency.processutils [None req-c9553c95-9e0e-487f-914f-95188291e1ca 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:21:42 compute-0 nova_compute[183278]: 2026-01-21 18:21:42.093 183284 DEBUG oslo_concurrency.processutils [None req-c9553c95-9e0e-487f-914f-95188291e1ca 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:21:42 compute-0 nova_compute[183278]: 2026-01-21 18:21:42.094 183284 DEBUG oslo_concurrency.lockutils [None req-c9553c95-9e0e-487f-914f-95188291e1ca 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "8bf45782a806e8f2f684ae874be9ab99d891a685" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:21:42 compute-0 nova_compute[183278]: 2026-01-21 18:21:42.095 183284 DEBUG oslo_concurrency.lockutils [None req-c9553c95-9e0e-487f-914f-95188291e1ca 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lock "8bf45782a806e8f2f684ae874be9ab99d891a685" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:21:42 compute-0 nova_compute[183278]: 2026-01-21 18:21:42.105 183284 DEBUG oslo_concurrency.processutils [None req-c9553c95-9e0e-487f-914f-95188291e1ca 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:21:42 compute-0 nova_compute[183278]: 2026-01-21 18:21:42.159 183284 DEBUG oslo_concurrency.processutils [None req-c9553c95-9e0e-487f-914f-95188291e1ca 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:21:42 compute-0 nova_compute[183278]: 2026-01-21 18:21:42.161 183284 DEBUG oslo_concurrency.processutils [None req-c9553c95-9e0e-487f-914f-95188291e1ca 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685,backing_fmt=raw /var/lib/nova/instances/ae4bd600-2611-4780-b4f9-571296621dee/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:21:42 compute-0 nova_compute[183278]: 2026-01-21 18:21:42.194 183284 DEBUG oslo_concurrency.processutils [None req-c9553c95-9e0e-487f-914f-95188291e1ca 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685,backing_fmt=raw /var/lib/nova/instances/ae4bd600-2611-4780-b4f9-571296621dee/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:21:42 compute-0 nova_compute[183278]: 2026-01-21 18:21:42.195 183284 DEBUG oslo_concurrency.lockutils [None req-c9553c95-9e0e-487f-914f-95188291e1ca 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lock "8bf45782a806e8f2f684ae874be9ab99d891a685" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.100s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:21:42 compute-0 nova_compute[183278]: 2026-01-21 18:21:42.195 183284 DEBUG oslo_concurrency.processutils [None req-c9553c95-9e0e-487f-914f-95188291e1ca 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:21:42 compute-0 nova_compute[183278]: 2026-01-21 18:21:42.251 183284 DEBUG oslo_concurrency.processutils [None req-c9553c95-9e0e-487f-914f-95188291e1ca 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:21:42 compute-0 nova_compute[183278]: 2026-01-21 18:21:42.253 183284 DEBUG nova.virt.disk.api [None req-c9553c95-9e0e-487f-914f-95188291e1ca 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Checking if we can resize image /var/lib/nova/instances/ae4bd600-2611-4780-b4f9-571296621dee/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 18:21:42 compute-0 nova_compute[183278]: 2026-01-21 18:21:42.254 183284 DEBUG oslo_concurrency.processutils [None req-c9553c95-9e0e-487f-914f-95188291e1ca 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ae4bd600-2611-4780-b4f9-571296621dee/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:21:42 compute-0 nova_compute[183278]: 2026-01-21 18:21:42.310 183284 DEBUG oslo_concurrency.processutils [None req-c9553c95-9e0e-487f-914f-95188291e1ca 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ae4bd600-2611-4780-b4f9-571296621dee/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:21:42 compute-0 nova_compute[183278]: 2026-01-21 18:21:42.311 183284 DEBUG nova.virt.disk.api [None req-c9553c95-9e0e-487f-914f-95188291e1ca 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Cannot resize image /var/lib/nova/instances/ae4bd600-2611-4780-b4f9-571296621dee/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 18:21:42 compute-0 nova_compute[183278]: 2026-01-21 18:21:42.311 183284 DEBUG nova.objects.instance [None req-c9553c95-9e0e-487f-914f-95188291e1ca 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lazy-loading 'migration_context' on Instance uuid ae4bd600-2611-4780-b4f9-571296621dee obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:21:42 compute-0 nova_compute[183278]: 2026-01-21 18:21:42.332 183284 DEBUG oslo_concurrency.processutils [None req-c9553c95-9e0e-487f-914f-95188291e1ca 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/ae4bd600-2611-4780-b4f9-571296621dee/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:21:42 compute-0 nova_compute[183278]: 2026-01-21 18:21:42.352 183284 DEBUG oslo_concurrency.processutils [None req-c9553c95-9e0e-487f-914f-95188291e1ca 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/ae4bd600-2611-4780-b4f9-571296621dee/disk.config 485376" returned: 0 in 0.020s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:21:42 compute-0 nova_compute[183278]: 2026-01-21 18:21:42.354 183284 DEBUG nova.virt.libvirt.volume.remotefs [None req-c9553c95-9e0e-487f-914f-95188291e1ca 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Copying file compute-1.ctlplane.example.com:/var/lib/nova/instances/ae4bd600-2611-4780-b4f9-571296621dee/disk.config to /var/lib/nova/instances/ae4bd600-2611-4780-b4f9-571296621dee copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 21 18:21:42 compute-0 nova_compute[183278]: 2026-01-21 18:21:42.354 183284 DEBUG oslo_concurrency.processutils [None req-c9553c95-9e0e-487f-914f-95188291e1ca 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Running cmd (subprocess): scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/ae4bd600-2611-4780-b4f9-571296621dee/disk.config /var/lib/nova/instances/ae4bd600-2611-4780-b4f9-571296621dee execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:21:42 compute-0 nova_compute[183278]: 2026-01-21 18:21:42.797 183284 DEBUG oslo_concurrency.processutils [None req-c9553c95-9e0e-487f-914f-95188291e1ca 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] CMD "scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/ae4bd600-2611-4780-b4f9-571296621dee/disk.config /var/lib/nova/instances/ae4bd600-2611-4780-b4f9-571296621dee" returned: 0 in 0.443s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:21:42 compute-0 nova_compute[183278]: 2026-01-21 18:21:42.798 183284 DEBUG nova.virt.libvirt.driver [None req-c9553c95-9e0e-487f-914f-95188291e1ca 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ae4bd600-2611-4780-b4f9-571296621dee] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Jan 21 18:21:42 compute-0 nova_compute[183278]: 2026-01-21 18:21:42.799 183284 DEBUG nova.virt.libvirt.vif [None req-c9553c95-9e0e-487f-914f-95188291e1ca 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T18:20:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-2044527699',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-2044527699',id=9,image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T18:21:07Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='05d148a48e724bbaa4c36f8069f80fbd',ramdisk_id='',reservation_id='r-m92y57w2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1250997236',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-1250997236-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T18:21:07Z,user_data=None,user_id='3ab455a0326b442d986277b4d934e2b2',uuid=ae4bd600-2611-4780-b4f9-571296621dee,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2211746e-0564-4c99-81c5-664846dc9eb4", "address": "fa:16:3e:54:8f:7c", "network": {"id": "5c0bfae5-9d33-4194-a6cd-c123314635af", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1349569805-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05d148a48e724bbaa4c36f8069f80fbd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap2211746e-05", "ovs_interfaceid": "2211746e-0564-4c99-81c5-664846dc9eb4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 21 18:21:42 compute-0 nova_compute[183278]: 2026-01-21 18:21:42.800 183284 DEBUG nova.network.os_vif_util [None req-c9553c95-9e0e-487f-914f-95188291e1ca 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Converting VIF {"id": "2211746e-0564-4c99-81c5-664846dc9eb4", "address": "fa:16:3e:54:8f:7c", "network": {"id": "5c0bfae5-9d33-4194-a6cd-c123314635af", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1349569805-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05d148a48e724bbaa4c36f8069f80fbd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap2211746e-05", "ovs_interfaceid": "2211746e-0564-4c99-81c5-664846dc9eb4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 18:21:42 compute-0 nova_compute[183278]: 2026-01-21 18:21:42.800 183284 DEBUG nova.network.os_vif_util [None req-c9553c95-9e0e-487f-914f-95188291e1ca 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:54:8f:7c,bridge_name='br-int',has_traffic_filtering=True,id=2211746e-0564-4c99-81c5-664846dc9eb4,network=Network(5c0bfae5-9d33-4194-a6cd-c123314635af),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2211746e-05') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 18:21:42 compute-0 nova_compute[183278]: 2026-01-21 18:21:42.801 183284 DEBUG os_vif [None req-c9553c95-9e0e-487f-914f-95188291e1ca 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:54:8f:7c,bridge_name='br-int',has_traffic_filtering=True,id=2211746e-0564-4c99-81c5-664846dc9eb4,network=Network(5c0bfae5-9d33-4194-a6cd-c123314635af),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2211746e-05') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 21 18:21:42 compute-0 nova_compute[183278]: 2026-01-21 18:21:42.801 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:21:42 compute-0 nova_compute[183278]: 2026-01-21 18:21:42.802 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:21:42 compute-0 nova_compute[183278]: 2026-01-21 18:21:42.802 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 18:21:42 compute-0 nova_compute[183278]: 2026-01-21 18:21:42.804 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:21:42 compute-0 nova_compute[183278]: 2026-01-21 18:21:42.804 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2211746e-05, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:21:42 compute-0 nova_compute[183278]: 2026-01-21 18:21:42.805 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2211746e-05, col_values=(('external_ids', {'iface-id': '2211746e-0564-4c99-81c5-664846dc9eb4', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:54:8f:7c', 'vm-uuid': 'ae4bd600-2611-4780-b4f9-571296621dee'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:21:42 compute-0 NetworkManager[55506]: <info>  [1769019702.8072] manager: (tap2211746e-05): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/40)
Jan 21 18:21:42 compute-0 nova_compute[183278]: 2026-01-21 18:21:42.807 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:21:42 compute-0 nova_compute[183278]: 2026-01-21 18:21:42.810 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 18:21:42 compute-0 nova_compute[183278]: 2026-01-21 18:21:42.813 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:21:42 compute-0 nova_compute[183278]: 2026-01-21 18:21:42.814 183284 INFO os_vif [None req-c9553c95-9e0e-487f-914f-95188291e1ca 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:54:8f:7c,bridge_name='br-int',has_traffic_filtering=True,id=2211746e-0564-4c99-81c5-664846dc9eb4,network=Network(5c0bfae5-9d33-4194-a6cd-c123314635af),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2211746e-05')
Jan 21 18:21:42 compute-0 nova_compute[183278]: 2026-01-21 18:21:42.815 183284 DEBUG nova.virt.libvirt.driver [None req-c9553c95-9e0e-487f-914f-95188291e1ca 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Jan 21 18:21:42 compute-0 nova_compute[183278]: 2026-01-21 18:21:42.815 183284 DEBUG nova.compute.manager [None req-c9553c95-9e0e-487f-914f-95188291e1ca 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpc5g9tr9v',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='ae4bd600-2611-4780-b4f9-571296621dee',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Jan 21 18:21:43 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:21:43.717 104698 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e2:aa:66', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f6:39:87:73:c8'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 18:21:43 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:21:43.717 104698 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 21 18:21:43 compute-0 nova_compute[183278]: 2026-01-21 18:21:43.753 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:21:44 compute-0 nova_compute[183278]: 2026-01-21 18:21:44.956 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:21:46 compute-0 nova_compute[183278]: 2026-01-21 18:21:46.013 183284 DEBUG nova.network.neutron [None req-c9553c95-9e0e-487f-914f-95188291e1ca 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ae4bd600-2611-4780-b4f9-571296621dee] Port 2211746e-0564-4c99-81c5-664846dc9eb4 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Jan 21 18:21:46 compute-0 nova_compute[183278]: 2026-01-21 18:21:46.014 183284 DEBUG nova.compute.manager [None req-c9553c95-9e0e-487f-914f-95188291e1ca 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpc5g9tr9v',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='ae4bd600-2611-4780-b4f9-571296621dee',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Jan 21 18:21:46 compute-0 podman[206712]: 2026-01-21 18:21:46.99641873 +0000 UTC m=+0.056351647 container health_status 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.openshift.tags=minimal rhel9, release=1755695350, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, managed_by=edpm_ansible, container_name=openstack_network_exporter, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, version=9.6, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7)
Jan 21 18:21:47 compute-0 systemd[1]: Starting libvirt proxy daemon...
Jan 21 18:21:47 compute-0 systemd[1]: Started libvirt proxy daemon.
Jan 21 18:21:47 compute-0 kernel: tap2211746e-05: entered promiscuous mode
Jan 21 18:21:47 compute-0 NetworkManager[55506]: <info>  [1769019707.2343] manager: (tap2211746e-05): new Tun device (/org/freedesktop/NetworkManager/Devices/41)
Jan 21 18:21:47 compute-0 nova_compute[183278]: 2026-01-21 18:21:47.235 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:21:47 compute-0 ovn_controller[95419]: 2026-01-21T18:21:47Z|00076|binding|INFO|Claiming lport 2211746e-0564-4c99-81c5-664846dc9eb4 for this additional chassis.
Jan 21 18:21:47 compute-0 ovn_controller[95419]: 2026-01-21T18:21:47Z|00077|binding|INFO|2211746e-0564-4c99-81c5-664846dc9eb4: Claiming fa:16:3e:54:8f:7c 10.100.0.8
Jan 21 18:21:47 compute-0 ovn_controller[95419]: 2026-01-21T18:21:47Z|00078|binding|INFO|Setting lport 2211746e-0564-4c99-81c5-664846dc9eb4 ovn-installed in OVS
Jan 21 18:21:47 compute-0 nova_compute[183278]: 2026-01-21 18:21:47.249 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:21:47 compute-0 nova_compute[183278]: 2026-01-21 18:21:47.252 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:21:47 compute-0 systemd-udevd[206766]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 18:21:47 compute-0 systemd-machined[154592]: New machine qemu-7-instance-00000009.
Jan 21 18:21:47 compute-0 NetworkManager[55506]: <info>  [1769019707.2703] device (tap2211746e-05): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 18:21:47 compute-0 NetworkManager[55506]: <info>  [1769019707.2710] device (tap2211746e-05): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 18:21:47 compute-0 systemd[1]: Started Virtual Machine qemu-7-instance-00000009.
Jan 21 18:21:47 compute-0 nova_compute[183278]: 2026-01-21 18:21:47.806 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:21:48 compute-0 nova_compute[183278]: 2026-01-21 18:21:48.595 183284 DEBUG nova.virt.driver [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Emitting event <LifecycleEvent: 1769019708.5947983, ae4bd600-2611-4780-b4f9-571296621dee => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:21:48 compute-0 nova_compute[183278]: 2026-01-21 18:21:48.595 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: ae4bd600-2611-4780-b4f9-571296621dee] VM Started (Lifecycle Event)
Jan 21 18:21:48 compute-0 nova_compute[183278]: 2026-01-21 18:21:48.780 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: ae4bd600-2611-4780-b4f9-571296621dee] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:21:49 compute-0 nova_compute[183278]: 2026-01-21 18:21:49.413 183284 DEBUG nova.virt.driver [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Emitting event <LifecycleEvent: 1769019709.4131765, ae4bd600-2611-4780-b4f9-571296621dee => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:21:49 compute-0 nova_compute[183278]: 2026-01-21 18:21:49.414 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: ae4bd600-2611-4780-b4f9-571296621dee] VM Resumed (Lifecycle Event)
Jan 21 18:21:49 compute-0 nova_compute[183278]: 2026-01-21 18:21:49.431 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: ae4bd600-2611-4780-b4f9-571296621dee] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:21:49 compute-0 nova_compute[183278]: 2026-01-21 18:21:49.435 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: ae4bd600-2611-4780-b4f9-571296621dee] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 18:21:49 compute-0 nova_compute[183278]: 2026-01-21 18:21:49.460 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: ae4bd600-2611-4780-b4f9-571296621dee] During the sync_power process the instance has moved from host compute-1.ctlplane.example.com to host compute-0.ctlplane.example.com
Jan 21 18:21:49 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:21:49.720 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a4db021d-a451-4e5f-8011-49af760bda68, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:21:49 compute-0 nova_compute[183278]: 2026-01-21 18:21:49.958 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:21:50 compute-0 ovn_controller[95419]: 2026-01-21T18:21:50Z|00079|binding|INFO|Claiming lport 2211746e-0564-4c99-81c5-664846dc9eb4 for this chassis.
Jan 21 18:21:50 compute-0 ovn_controller[95419]: 2026-01-21T18:21:50Z|00080|binding|INFO|2211746e-0564-4c99-81c5-664846dc9eb4: Claiming fa:16:3e:54:8f:7c 10.100.0.8
Jan 21 18:21:50 compute-0 ovn_controller[95419]: 2026-01-21T18:21:50Z|00081|binding|INFO|Setting lport 2211746e-0564-4c99-81c5-664846dc9eb4 up in Southbound
Jan 21 18:21:50 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:21:50.549 104698 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:54:8f:7c 10.100.0.8'], port_security=['fa:16:3e:54:8f:7c 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'ae4bd600-2611-4780-b4f9-571296621dee', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5c0bfae5-9d33-4194-a6cd-c123314635af', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '05d148a48e724bbaa4c36f8069f80fbd', 'neutron:revision_number': '12', 'neutron:security_group_ids': '20b590e1-7eda-4a33-8e20-e2c79963b0dc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a825335a-1d7b-43d5-8e75-4088b4c18dcd, chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>], logical_port=2211746e-0564-4c99-81c5-664846dc9eb4) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 18:21:50 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:21:50.550 104698 INFO neutron.agent.ovn.metadata.agent [-] Port 2211746e-0564-4c99-81c5-664846dc9eb4 in datapath 5c0bfae5-9d33-4194-a6cd-c123314635af bound to our chassis
Jan 21 18:21:50 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:21:50.552 104698 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 5c0bfae5-9d33-4194-a6cd-c123314635af
Jan 21 18:21:50 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:21:50.572 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[9e4d87fc-10cb-4e08-b8cc-d6185cfba5de]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:21:50 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:21:50.602 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[7c2d6373-7c93-462a-a8e3-80d70b08dc52]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:21:50 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:21:50.604 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[25c68581-db0a-4e37-88bf-1fbcf67add7f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:21:50 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:21:50.631 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[a2f4c352-a58b-4f69-b2d3-0b567fbddee2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:21:50 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:21:50.647 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[c25e37f6-1393-49a1-a923-be56ca36137a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5c0bfae5-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:73:69:dd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 25], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 424753, 'reachable_time': 21954, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 206803, 'error': None, 'target': 'ovnmeta-5c0bfae5-9d33-4194-a6cd-c123314635af', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:21:50 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:21:50.661 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[da7aaac4-3bce-49c3-aed9-7f019a9a4466]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap5c0bfae5-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 424763, 'tstamp': 424763}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 206804, 'error': None, 'target': 'ovnmeta-5c0bfae5-9d33-4194-a6cd-c123314635af', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap5c0bfae5-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 424766, 'tstamp': 424766}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 206804, 'error': None, 'target': 'ovnmeta-5c0bfae5-9d33-4194-a6cd-c123314635af', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:21:50 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:21:50.662 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5c0bfae5-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:21:50 compute-0 nova_compute[183278]: 2026-01-21 18:21:50.664 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:21:50 compute-0 nova_compute[183278]: 2026-01-21 18:21:50.665 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:21:50 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:21:50.665 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5c0bfae5-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:21:50 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:21:50.665 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 18:21:50 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:21:50.665 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap5c0bfae5-90, col_values=(('external_ids', {'iface-id': 'ae78296c-7244-4dd2-9b62-f5f86b1d8165'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:21:50 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:21:50.666 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 18:21:50 compute-0 nova_compute[183278]: 2026-01-21 18:21:50.756 183284 INFO nova.compute.manager [None req-c9553c95-9e0e-487f-914f-95188291e1ca 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ae4bd600-2611-4780-b4f9-571296621dee] Post operation of migration started
Jan 21 18:21:51 compute-0 nova_compute[183278]: 2026-01-21 18:21:51.002 183284 DEBUG oslo_concurrency.lockutils [None req-c9553c95-9e0e-487f-914f-95188291e1ca 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "refresh_cache-ae4bd600-2611-4780-b4f9-571296621dee" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:21:51 compute-0 nova_compute[183278]: 2026-01-21 18:21:51.003 183284 DEBUG oslo_concurrency.lockutils [None req-c9553c95-9e0e-487f-914f-95188291e1ca 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Acquired lock "refresh_cache-ae4bd600-2611-4780-b4f9-571296621dee" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 18:21:51 compute-0 nova_compute[183278]: 2026-01-21 18:21:51.003 183284 DEBUG nova.network.neutron [None req-c9553c95-9e0e-487f-914f-95188291e1ca 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ae4bd600-2611-4780-b4f9-571296621dee] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 18:21:52 compute-0 nova_compute[183278]: 2026-01-21 18:21:52.809 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:21:53 compute-0 nova_compute[183278]: 2026-01-21 18:21:53.011 183284 DEBUG nova.network.neutron [None req-c9553c95-9e0e-487f-914f-95188291e1ca 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ae4bd600-2611-4780-b4f9-571296621dee] Updating instance_info_cache with network_info: [{"id": "2211746e-0564-4c99-81c5-664846dc9eb4", "address": "fa:16:3e:54:8f:7c", "network": {"id": "5c0bfae5-9d33-4194-a6cd-c123314635af", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1349569805-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05d148a48e724bbaa4c36f8069f80fbd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2211746e-05", "ovs_interfaceid": "2211746e-0564-4c99-81c5-664846dc9eb4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:21:53 compute-0 nova_compute[183278]: 2026-01-21 18:21:53.037 183284 DEBUG oslo_concurrency.lockutils [None req-c9553c95-9e0e-487f-914f-95188291e1ca 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Releasing lock "refresh_cache-ae4bd600-2611-4780-b4f9-571296621dee" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 18:21:53 compute-0 nova_compute[183278]: 2026-01-21 18:21:53.052 183284 DEBUG oslo_concurrency.lockutils [None req-c9553c95-9e0e-487f-914f-95188291e1ca 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:21:53 compute-0 nova_compute[183278]: 2026-01-21 18:21:53.052 183284 DEBUG oslo_concurrency.lockutils [None req-c9553c95-9e0e-487f-914f-95188291e1ca 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:21:53 compute-0 nova_compute[183278]: 2026-01-21 18:21:53.053 183284 DEBUG oslo_concurrency.lockutils [None req-c9553c95-9e0e-487f-914f-95188291e1ca 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:21:53 compute-0 nova_compute[183278]: 2026-01-21 18:21:53.056 183284 INFO nova.virt.libvirt.driver [None req-c9553c95-9e0e-487f-914f-95188291e1ca 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ae4bd600-2611-4780-b4f9-571296621dee] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Jan 21 18:21:53 compute-0 virtqemud[182681]: Domain id=7 name='instance-00000009' uuid=ae4bd600-2611-4780-b4f9-571296621dee is tainted: custom-monitor
Jan 21 18:21:54 compute-0 podman[206806]: 2026-01-21 18:21:54.004489436 +0000 UTC m=+0.057627077 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 21 18:21:54 compute-0 podman[206805]: 2026-01-21 18:21:54.03482117 +0000 UTC m=+0.089426146 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, 
maintainer=OpenStack Kubernetes Operator team)
Jan 21 18:21:54 compute-0 nova_compute[183278]: 2026-01-21 18:21:54.062 183284 INFO nova.virt.libvirt.driver [None req-c9553c95-9e0e-487f-914f-95188291e1ca 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ae4bd600-2611-4780-b4f9-571296621dee] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Jan 21 18:21:54 compute-0 nova_compute[183278]: 2026-01-21 18:21:54.961 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:21:55 compute-0 nova_compute[183278]: 2026-01-21 18:21:55.067 183284 INFO nova.virt.libvirt.driver [None req-c9553c95-9e0e-487f-914f-95188291e1ca 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ae4bd600-2611-4780-b4f9-571296621dee] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Jan 21 18:21:55 compute-0 nova_compute[183278]: 2026-01-21 18:21:55.072 183284 DEBUG nova.compute.manager [None req-c9553c95-9e0e-487f-914f-95188291e1ca 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ae4bd600-2611-4780-b4f9-571296621dee] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:21:55 compute-0 nova_compute[183278]: 2026-01-21 18:21:55.127 183284 DEBUG nova.objects.instance [None req-c9553c95-9e0e-487f-914f-95188291e1ca 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ae4bd600-2611-4780-b4f9-571296621dee] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Jan 21 18:21:57 compute-0 nova_compute[183278]: 2026-01-21 18:21:57.810 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:21:58 compute-0 podman[206848]: 2026-01-21 18:21:58.994335387 +0000 UTC m=+0.044953851 container health_status 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 21 18:21:59 compute-0 podman[192560]: time="2026-01-21T18:21:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 18:21:59 compute-0 podman[192560]: @ - - [21/Jan/2026:18:21:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16587 "" "Go-http-client/1.1"
Jan 21 18:21:59 compute-0 podman[192560]: @ - - [21/Jan/2026:18:21:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2640 "" "Go-http-client/1.1"
Jan 21 18:21:59 compute-0 nova_compute[183278]: 2026-01-21 18:21:59.963 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:22:00 compute-0 nova_compute[183278]: 2026-01-21 18:22:00.646 183284 DEBUG oslo_concurrency.lockutils [None req-941d73be-1cc9-4d16-9a21-378029e3d031 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Acquiring lock "c5a6214c-5527-4115-ad3b-6320d177029b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:22:00 compute-0 nova_compute[183278]: 2026-01-21 18:22:00.646 183284 DEBUG oslo_concurrency.lockutils [None req-941d73be-1cc9-4d16-9a21-378029e3d031 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Lock "c5a6214c-5527-4115-ad3b-6320d177029b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:22:00 compute-0 nova_compute[183278]: 2026-01-21 18:22:00.647 183284 DEBUG oslo_concurrency.lockutils [None req-941d73be-1cc9-4d16-9a21-378029e3d031 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Acquiring lock "c5a6214c-5527-4115-ad3b-6320d177029b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:22:00 compute-0 nova_compute[183278]: 2026-01-21 18:22:00.647 183284 DEBUG oslo_concurrency.lockutils [None req-941d73be-1cc9-4d16-9a21-378029e3d031 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Lock "c5a6214c-5527-4115-ad3b-6320d177029b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:22:00 compute-0 nova_compute[183278]: 2026-01-21 18:22:00.647 183284 DEBUG oslo_concurrency.lockutils [None req-941d73be-1cc9-4d16-9a21-378029e3d031 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Lock "c5a6214c-5527-4115-ad3b-6320d177029b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:22:00 compute-0 nova_compute[183278]: 2026-01-21 18:22:00.649 183284 INFO nova.compute.manager [None req-941d73be-1cc9-4d16-9a21-378029e3d031 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] [instance: c5a6214c-5527-4115-ad3b-6320d177029b] Terminating instance
Jan 21 18:22:00 compute-0 nova_compute[183278]: 2026-01-21 18:22:00.650 183284 DEBUG nova.compute.manager [None req-941d73be-1cc9-4d16-9a21-378029e3d031 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] [instance: c5a6214c-5527-4115-ad3b-6320d177029b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 21 18:22:00 compute-0 kernel: tap1159582e-ad (unregistering): left promiscuous mode
Jan 21 18:22:00 compute-0 NetworkManager[55506]: <info>  [1769019720.6770] device (tap1159582e-ad): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 18:22:00 compute-0 nova_compute[183278]: 2026-01-21 18:22:00.684 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:22:00 compute-0 ovn_controller[95419]: 2026-01-21T18:22:00Z|00082|binding|INFO|Releasing lport 1159582e-ade2-4ad8-9e4e-f841287d3a51 from this chassis (sb_readonly=0)
Jan 21 18:22:00 compute-0 ovn_controller[95419]: 2026-01-21T18:22:00Z|00083|binding|INFO|Setting lport 1159582e-ade2-4ad8-9e4e-f841287d3a51 down in Southbound
Jan 21 18:22:00 compute-0 ovn_controller[95419]: 2026-01-21T18:22:00Z|00084|binding|INFO|Removing iface tap1159582e-ad ovn-installed in OVS
Jan 21 18:22:00 compute-0 nova_compute[183278]: 2026-01-21 18:22:00.686 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:22:00 compute-0 nova_compute[183278]: 2026-01-21 18:22:00.696 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:22:00 compute-0 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d0000000a.scope: Deactivated successfully.
Jan 21 18:22:00 compute-0 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d0000000a.scope: Consumed 13.525s CPU time.
Jan 21 18:22:00 compute-0 systemd-machined[154592]: Machine qemu-6-instance-0000000a terminated.
Jan 21 18:22:00 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:22:00.802 104698 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8a:6e:ce 10.100.0.12'], port_security=['fa:16:3e:8a:6e:ce 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'c5a6214c-5527-4115-ad3b-6320d177029b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5c0bfae5-9d33-4194-a6cd-c123314635af', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '05d148a48e724bbaa4c36f8069f80fbd', 'neutron:revision_number': '4', 'neutron:security_group_ids': '20b590e1-7eda-4a33-8e20-e2c79963b0dc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a825335a-1d7b-43d5-8e75-4088b4c18dcd, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>], logical_port=1159582e-ade2-4ad8-9e4e-f841287d3a51) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 18:22:00 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:22:00.803 104698 INFO neutron.agent.ovn.metadata.agent [-] Port 1159582e-ade2-4ad8-9e4e-f841287d3a51 in datapath 5c0bfae5-9d33-4194-a6cd-c123314635af unbound from our chassis
Jan 21 18:22:00 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:22:00.804 104698 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 5c0bfae5-9d33-4194-a6cd-c123314635af
Jan 21 18:22:00 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:22:00.818 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[cf5a8218-d5da-4774-a215-09809d59b3a5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:22:00 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:22:00.845 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[c79fed35-1dc7-4d21-bb62-dd9f7b0f7341]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:22:00 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:22:00.848 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[31771a41-98e9-48d6-b5d7-5c6b1b06347d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:22:00 compute-0 NetworkManager[55506]: <info>  [1769019720.8657] manager: (tap1159582e-ad): new Tun device (/org/freedesktop/NetworkManager/Devices/42)
Jan 21 18:22:00 compute-0 kernel: tap1159582e-ad: entered promiscuous mode
Jan 21 18:22:00 compute-0 kernel: tap1159582e-ad (unregistering): left promiscuous mode
Jan 21 18:22:00 compute-0 nova_compute[183278]: 2026-01-21 18:22:00.872 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:22:00 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:22:00.874 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[8cce41fd-ba1b-4984-ac7c-12c9b14641fe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:22:00 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:22:00.890 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[0a14c84b-b0c8-47a8-a916-5cb5613f54b3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5c0bfae5-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:73:69:dd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 25], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 424753, 'reachable_time': 21954, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 206893, 'error': None, 'target': 'ovnmeta-5c0bfae5-9d33-4194-a6cd-c123314635af', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:22:00 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:22:00.905 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[b67e6557-609f-4ba3-a759-747519125635]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap5c0bfae5-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 424763, 'tstamp': 424763}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 206900, 'error': None, 'target': 'ovnmeta-5c0bfae5-9d33-4194-a6cd-c123314635af', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap5c0bfae5-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 424766, 'tstamp': 424766}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 206900, 'error': None, 'target': 'ovnmeta-5c0bfae5-9d33-4194-a6cd-c123314635af', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:22:00 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:22:00.907 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5c0bfae5-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:22:00 compute-0 nova_compute[183278]: 2026-01-21 18:22:00.907 183284 INFO nova.virt.libvirt.driver [-] [instance: c5a6214c-5527-4115-ad3b-6320d177029b] Instance destroyed successfully.
Jan 21 18:22:00 compute-0 nova_compute[183278]: 2026-01-21 18:22:00.907 183284 DEBUG nova.objects.instance [None req-941d73be-1cc9-4d16-9a21-378029e3d031 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Lazy-loading 'resources' on Instance uuid c5a6214c-5527-4115-ad3b-6320d177029b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:22:00 compute-0 nova_compute[183278]: 2026-01-21 18:22:00.908 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:22:00 compute-0 nova_compute[183278]: 2026-01-21 18:22:00.912 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:22:00 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:22:00.913 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5c0bfae5-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:22:00 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:22:00.913 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 18:22:00 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:22:00.914 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap5c0bfae5-90, col_values=(('external_ids', {'iface-id': 'ae78296c-7244-4dd2-9b62-f5f86b1d8165'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:22:00 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:22:00.914 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 18:22:00 compute-0 nova_compute[183278]: 2026-01-21 18:22:00.922 183284 DEBUG nova.virt.libvirt.vif [None req-941d73be-1cc9-4d16-9a21-378029e3d031 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T18:21:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1525730657',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1525730657',id=10,image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T18:21:24Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='05d148a48e724bbaa4c36f8069f80fbd',ramdisk_id='',reservation_id='r-wgas9oiz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min
_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1250997236',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-1250997236-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T18:21:24Z,user_data=None,user_id='3ab455a0326b442d986277b4d934e2b2',uuid=c5a6214c-5527-4115-ad3b-6320d177029b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1159582e-ade2-4ad8-9e4e-f841287d3a51", "address": "fa:16:3e:8a:6e:ce", "network": {"id": "5c0bfae5-9d33-4194-a6cd-c123314635af", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1349569805-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05d148a48e724bbaa4c36f8069f80fbd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1159582e-ad", "ovs_interfaceid": "1159582e-ade2-4ad8-9e4e-f841287d3a51", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 18:22:00 compute-0 nova_compute[183278]: 2026-01-21 18:22:00.922 183284 DEBUG nova.network.os_vif_util [None req-941d73be-1cc9-4d16-9a21-378029e3d031 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Converting VIF {"id": "1159582e-ade2-4ad8-9e4e-f841287d3a51", "address": "fa:16:3e:8a:6e:ce", "network": {"id": "5c0bfae5-9d33-4194-a6cd-c123314635af", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1349569805-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05d148a48e724bbaa4c36f8069f80fbd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1159582e-ad", "ovs_interfaceid": "1159582e-ade2-4ad8-9e4e-f841287d3a51", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 18:22:00 compute-0 nova_compute[183278]: 2026-01-21 18:22:00.923 183284 DEBUG nova.network.os_vif_util [None req-941d73be-1cc9-4d16-9a21-378029e3d031 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8a:6e:ce,bridge_name='br-int',has_traffic_filtering=True,id=1159582e-ade2-4ad8-9e4e-f841287d3a51,network=Network(5c0bfae5-9d33-4194-a6cd-c123314635af),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1159582e-ad') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 18:22:00 compute-0 nova_compute[183278]: 2026-01-21 18:22:00.923 183284 DEBUG os_vif [None req-941d73be-1cc9-4d16-9a21-378029e3d031 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8a:6e:ce,bridge_name='br-int',has_traffic_filtering=True,id=1159582e-ade2-4ad8-9e4e-f841287d3a51,network=Network(5c0bfae5-9d33-4194-a6cd-c123314635af),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1159582e-ad') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 21 18:22:00 compute-0 nova_compute[183278]: 2026-01-21 18:22:00.925 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:22:00 compute-0 nova_compute[183278]: 2026-01-21 18:22:00.925 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1159582e-ad, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:22:00 compute-0 nova_compute[183278]: 2026-01-21 18:22:00.926 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:22:00 compute-0 nova_compute[183278]: 2026-01-21 18:22:00.928 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:22:00 compute-0 nova_compute[183278]: 2026-01-21 18:22:00.930 183284 INFO os_vif [None req-941d73be-1cc9-4d16-9a21-378029e3d031 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8a:6e:ce,bridge_name='br-int',has_traffic_filtering=True,id=1159582e-ade2-4ad8-9e4e-f841287d3a51,network=Network(5c0bfae5-9d33-4194-a6cd-c123314635af),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1159582e-ad')
Jan 21 18:22:00 compute-0 nova_compute[183278]: 2026-01-21 18:22:00.930 183284 INFO nova.virt.libvirt.driver [None req-941d73be-1cc9-4d16-9a21-378029e3d031 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] [instance: c5a6214c-5527-4115-ad3b-6320d177029b] Deleting instance files /var/lib/nova/instances/c5a6214c-5527-4115-ad3b-6320d177029b_del
Jan 21 18:22:00 compute-0 nova_compute[183278]: 2026-01-21 18:22:00.931 183284 INFO nova.virt.libvirt.driver [None req-941d73be-1cc9-4d16-9a21-378029e3d031 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] [instance: c5a6214c-5527-4115-ad3b-6320d177029b] Deletion of /var/lib/nova/instances/c5a6214c-5527-4115-ad3b-6320d177029b_del complete
Jan 21 18:22:00 compute-0 nova_compute[183278]: 2026-01-21 18:22:00.979 183284 INFO nova.compute.manager [None req-941d73be-1cc9-4d16-9a21-378029e3d031 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] [instance: c5a6214c-5527-4115-ad3b-6320d177029b] Took 0.33 seconds to destroy the instance on the hypervisor.
Jan 21 18:22:00 compute-0 nova_compute[183278]: 2026-01-21 18:22:00.979 183284 DEBUG oslo.service.loopingcall [None req-941d73be-1cc9-4d16-9a21-378029e3d031 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 21 18:22:00 compute-0 nova_compute[183278]: 2026-01-21 18:22:00.980 183284 DEBUG nova.compute.manager [-] [instance: c5a6214c-5527-4115-ad3b-6320d177029b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 21 18:22:00 compute-0 nova_compute[183278]: 2026-01-21 18:22:00.980 183284 DEBUG nova.network.neutron [-] [instance: c5a6214c-5527-4115-ad3b-6320d177029b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 21 18:22:01 compute-0 openstack_network_exporter[195402]: ERROR   18:22:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 21 18:22:01 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:22:01 compute-0 openstack_network_exporter[195402]: ERROR   18:22:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 21 18:22:01 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:22:02 compute-0 nova_compute[183278]: 2026-01-21 18:22:02.087 183284 DEBUG nova.compute.manager [req-3b8f7489-3319-4b46-b606-64f63d31e949 req-08e52806-3172-4c78-aaea-e95a61d0336e 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c5a6214c-5527-4115-ad3b-6320d177029b] Received event network-vif-unplugged-1159582e-ade2-4ad8-9e4e-f841287d3a51 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:22:02 compute-0 nova_compute[183278]: 2026-01-21 18:22:02.087 183284 DEBUG oslo_concurrency.lockutils [req-3b8f7489-3319-4b46-b606-64f63d31e949 req-08e52806-3172-4c78-aaea-e95a61d0336e 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "c5a6214c-5527-4115-ad3b-6320d177029b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:22:02 compute-0 nova_compute[183278]: 2026-01-21 18:22:02.087 183284 DEBUG oslo_concurrency.lockutils [req-3b8f7489-3319-4b46-b606-64f63d31e949 req-08e52806-3172-4c78-aaea-e95a61d0336e 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "c5a6214c-5527-4115-ad3b-6320d177029b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:22:02 compute-0 nova_compute[183278]: 2026-01-21 18:22:02.087 183284 DEBUG oslo_concurrency.lockutils [req-3b8f7489-3319-4b46-b606-64f63d31e949 req-08e52806-3172-4c78-aaea-e95a61d0336e 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "c5a6214c-5527-4115-ad3b-6320d177029b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:22:02 compute-0 nova_compute[183278]: 2026-01-21 18:22:02.088 183284 DEBUG nova.compute.manager [req-3b8f7489-3319-4b46-b606-64f63d31e949 req-08e52806-3172-4c78-aaea-e95a61d0336e 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c5a6214c-5527-4115-ad3b-6320d177029b] No waiting events found dispatching network-vif-unplugged-1159582e-ade2-4ad8-9e4e-f841287d3a51 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:22:02 compute-0 nova_compute[183278]: 2026-01-21 18:22:02.088 183284 DEBUG nova.compute.manager [req-3b8f7489-3319-4b46-b606-64f63d31e949 req-08e52806-3172-4c78-aaea-e95a61d0336e 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c5a6214c-5527-4115-ad3b-6320d177029b] Received event network-vif-unplugged-1159582e-ade2-4ad8-9e4e-f841287d3a51 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 21 18:22:03 compute-0 nova_compute[183278]: 2026-01-21 18:22:03.636 183284 DEBUG nova.network.neutron [-] [instance: c5a6214c-5527-4115-ad3b-6320d177029b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:22:03 compute-0 nova_compute[183278]: 2026-01-21 18:22:03.652 183284 INFO nova.compute.manager [-] [instance: c5a6214c-5527-4115-ad3b-6320d177029b] Took 2.67 seconds to deallocate network for instance.
Jan 21 18:22:03 compute-0 nova_compute[183278]: 2026-01-21 18:22:03.692 183284 DEBUG oslo_concurrency.lockutils [None req-941d73be-1cc9-4d16-9a21-378029e3d031 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:22:03 compute-0 nova_compute[183278]: 2026-01-21 18:22:03.693 183284 DEBUG oslo_concurrency.lockutils [None req-941d73be-1cc9-4d16-9a21-378029e3d031 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:22:03 compute-0 nova_compute[183278]: 2026-01-21 18:22:03.792 183284 DEBUG nova.compute.provider_tree [None req-941d73be-1cc9-4d16-9a21-378029e3d031 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Inventory has not changed in ProviderTree for provider: 502e4243-611b-433d-a766-9b485d51652d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 18:22:03 compute-0 nova_compute[183278]: 2026-01-21 18:22:03.811 183284 DEBUG nova.scheduler.client.report [None req-941d73be-1cc9-4d16-9a21-378029e3d031 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Inventory has not changed for provider 502e4243-611b-433d-a766-9b485d51652d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 18:22:03 compute-0 nova_compute[183278]: 2026-01-21 18:22:03.832 183284 DEBUG oslo_concurrency.lockutils [None req-941d73be-1cc9-4d16-9a21-378029e3d031 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.140s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:22:03 compute-0 nova_compute[183278]: 2026-01-21 18:22:03.856 183284 INFO nova.scheduler.client.report [None req-941d73be-1cc9-4d16-9a21-378029e3d031 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Deleted allocations for instance c5a6214c-5527-4115-ad3b-6320d177029b
Jan 21 18:22:03 compute-0 nova_compute[183278]: 2026-01-21 18:22:03.924 183284 DEBUG oslo_concurrency.lockutils [None req-941d73be-1cc9-4d16-9a21-378029e3d031 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Lock "c5a6214c-5527-4115-ad3b-6320d177029b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.278s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:22:04 compute-0 nova_compute[183278]: 2026-01-21 18:22:04.071 183284 DEBUG nova.compute.manager [req-d7bc6947-fcfe-4b46-8de3-2913c6ceeb2b req-001b6921-db0f-4c47-bde6-f89f2690553b 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c5a6214c-5527-4115-ad3b-6320d177029b] Received event network-vif-plugged-1159582e-ade2-4ad8-9e4e-f841287d3a51 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:22:04 compute-0 nova_compute[183278]: 2026-01-21 18:22:04.071 183284 DEBUG oslo_concurrency.lockutils [req-d7bc6947-fcfe-4b46-8de3-2913c6ceeb2b req-001b6921-db0f-4c47-bde6-f89f2690553b 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "c5a6214c-5527-4115-ad3b-6320d177029b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:22:04 compute-0 nova_compute[183278]: 2026-01-21 18:22:04.071 183284 DEBUG oslo_concurrency.lockutils [req-d7bc6947-fcfe-4b46-8de3-2913c6ceeb2b req-001b6921-db0f-4c47-bde6-f89f2690553b 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "c5a6214c-5527-4115-ad3b-6320d177029b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:22:04 compute-0 nova_compute[183278]: 2026-01-21 18:22:04.072 183284 DEBUG oslo_concurrency.lockutils [req-d7bc6947-fcfe-4b46-8de3-2913c6ceeb2b req-001b6921-db0f-4c47-bde6-f89f2690553b 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "c5a6214c-5527-4115-ad3b-6320d177029b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:22:04 compute-0 nova_compute[183278]: 2026-01-21 18:22:04.072 183284 DEBUG nova.compute.manager [req-d7bc6947-fcfe-4b46-8de3-2913c6ceeb2b req-001b6921-db0f-4c47-bde6-f89f2690553b 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c5a6214c-5527-4115-ad3b-6320d177029b] No waiting events found dispatching network-vif-plugged-1159582e-ade2-4ad8-9e4e-f841287d3a51 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:22:04 compute-0 nova_compute[183278]: 2026-01-21 18:22:04.072 183284 WARNING nova.compute.manager [req-d7bc6947-fcfe-4b46-8de3-2913c6ceeb2b req-001b6921-db0f-4c47-bde6-f89f2690553b 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c5a6214c-5527-4115-ad3b-6320d177029b] Received unexpected event network-vif-plugged-1159582e-ade2-4ad8-9e4e-f841287d3a51 for instance with vm_state deleted and task_state None.
Jan 21 18:22:04 compute-0 nova_compute[183278]: 2026-01-21 18:22:04.072 183284 DEBUG nova.compute.manager [req-d7bc6947-fcfe-4b46-8de3-2913c6ceeb2b req-001b6921-db0f-4c47-bde6-f89f2690553b 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c5a6214c-5527-4115-ad3b-6320d177029b] Received event network-vif-deleted-1159582e-ade2-4ad8-9e4e-f841287d3a51 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:22:04 compute-0 nova_compute[183278]: 2026-01-21 18:22:04.965 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:22:05 compute-0 nova_compute[183278]: 2026-01-21 18:22:05.270 183284 DEBUG oslo_concurrency.lockutils [None req-b4520378-7e8a-4658-9b98-4d57700b8ca4 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Acquiring lock "ae4bd600-2611-4780-b4f9-571296621dee" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:22:05 compute-0 nova_compute[183278]: 2026-01-21 18:22:05.271 183284 DEBUG oslo_concurrency.lockutils [None req-b4520378-7e8a-4658-9b98-4d57700b8ca4 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Lock "ae4bd600-2611-4780-b4f9-571296621dee" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:22:05 compute-0 nova_compute[183278]: 2026-01-21 18:22:05.271 183284 DEBUG oslo_concurrency.lockutils [None req-b4520378-7e8a-4658-9b98-4d57700b8ca4 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Acquiring lock "ae4bd600-2611-4780-b4f9-571296621dee-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:22:05 compute-0 nova_compute[183278]: 2026-01-21 18:22:05.272 183284 DEBUG oslo_concurrency.lockutils [None req-b4520378-7e8a-4658-9b98-4d57700b8ca4 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Lock "ae4bd600-2611-4780-b4f9-571296621dee-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:22:05 compute-0 nova_compute[183278]: 2026-01-21 18:22:05.272 183284 DEBUG oslo_concurrency.lockutils [None req-b4520378-7e8a-4658-9b98-4d57700b8ca4 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Lock "ae4bd600-2611-4780-b4f9-571296621dee-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:22:05 compute-0 nova_compute[183278]: 2026-01-21 18:22:05.273 183284 INFO nova.compute.manager [None req-b4520378-7e8a-4658-9b98-4d57700b8ca4 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] [instance: ae4bd600-2611-4780-b4f9-571296621dee] Terminating instance
Jan 21 18:22:05 compute-0 nova_compute[183278]: 2026-01-21 18:22:05.275 183284 DEBUG nova.compute.manager [None req-b4520378-7e8a-4658-9b98-4d57700b8ca4 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] [instance: ae4bd600-2611-4780-b4f9-571296621dee] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 21 18:22:05 compute-0 kernel: tap2211746e-05 (unregistering): left promiscuous mode
Jan 21 18:22:05 compute-0 NetworkManager[55506]: <info>  [1769019725.2977] device (tap2211746e-05): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 18:22:05 compute-0 nova_compute[183278]: 2026-01-21 18:22:05.302 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:22:05 compute-0 ovn_controller[95419]: 2026-01-21T18:22:05Z|00085|binding|INFO|Releasing lport 2211746e-0564-4c99-81c5-664846dc9eb4 from this chassis (sb_readonly=0)
Jan 21 18:22:05 compute-0 ovn_controller[95419]: 2026-01-21T18:22:05Z|00086|binding|INFO|Setting lport 2211746e-0564-4c99-81c5-664846dc9eb4 down in Southbound
Jan 21 18:22:05 compute-0 ovn_controller[95419]: 2026-01-21T18:22:05Z|00087|binding|INFO|Removing iface tap2211746e-05 ovn-installed in OVS
Jan 21 18:22:05 compute-0 nova_compute[183278]: 2026-01-21 18:22:05.305 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:22:05 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:22:05.314 104698 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:54:8f:7c 10.100.0.8'], port_security=['fa:16:3e:54:8f:7c 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'ae4bd600-2611-4780-b4f9-571296621dee', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5c0bfae5-9d33-4194-a6cd-c123314635af', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '05d148a48e724bbaa4c36f8069f80fbd', 'neutron:revision_number': '14', 'neutron:security_group_ids': '20b590e1-7eda-4a33-8e20-e2c79963b0dc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a825335a-1d7b-43d5-8e75-4088b4c18dcd, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>], logical_port=2211746e-0564-4c99-81c5-664846dc9eb4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 18:22:05 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:22:05.315 104698 INFO neutron.agent.ovn.metadata.agent [-] Port 2211746e-0564-4c99-81c5-664846dc9eb4 in datapath 5c0bfae5-9d33-4194-a6cd-c123314635af unbound from our chassis
Jan 21 18:22:05 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:22:05.316 104698 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5c0bfae5-9d33-4194-a6cd-c123314635af, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 21 18:22:05 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:22:05.317 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[a295b716-7447-46f4-b8f7-e83e8ba69928]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:22:05 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:22:05.317 104698 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-5c0bfae5-9d33-4194-a6cd-c123314635af namespace which is not needed anymore
Jan 21 18:22:05 compute-0 nova_compute[183278]: 2026-01-21 18:22:05.318 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:22:05 compute-0 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000009.scope: Deactivated successfully.
Jan 21 18:22:05 compute-0 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000009.scope: Consumed 3.537s CPU time.
Jan 21 18:22:05 compute-0 systemd-machined[154592]: Machine qemu-7-instance-00000009 terminated.
Jan 21 18:22:05 compute-0 neutron-haproxy-ovnmeta-5c0bfae5-9d33-4194-a6cd-c123314635af[206630]: [NOTICE]   (206634) : haproxy version is 2.8.14-c23fe91
Jan 21 18:22:05 compute-0 neutron-haproxy-ovnmeta-5c0bfae5-9d33-4194-a6cd-c123314635af[206630]: [NOTICE]   (206634) : path to executable is /usr/sbin/haproxy
Jan 21 18:22:05 compute-0 neutron-haproxy-ovnmeta-5c0bfae5-9d33-4194-a6cd-c123314635af[206630]: [WARNING]  (206634) : Exiting Master process...
Jan 21 18:22:05 compute-0 neutron-haproxy-ovnmeta-5c0bfae5-9d33-4194-a6cd-c123314635af[206630]: [ALERT]    (206634) : Current worker (206636) exited with code 143 (Terminated)
Jan 21 18:22:05 compute-0 neutron-haproxy-ovnmeta-5c0bfae5-9d33-4194-a6cd-c123314635af[206630]: [WARNING]  (206634) : All workers exited. Exiting... (0)
Jan 21 18:22:05 compute-0 systemd[1]: libpod-f8d4fe423e568f35a08872a5c6b419d2f81a002c74c7600a716264705f05ad91.scope: Deactivated successfully.
Jan 21 18:22:05 compute-0 podman[206928]: 2026-01-21 18:22:05.447565591 +0000 UTC m=+0.043947786 container died f8d4fe423e568f35a08872a5c6b419d2f81a002c74c7600a716264705f05ad91 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5c0bfae5-9d33-4194-a6cd-c123314635af, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 21 18:22:05 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f8d4fe423e568f35a08872a5c6b419d2f81a002c74c7600a716264705f05ad91-userdata-shm.mount: Deactivated successfully.
Jan 21 18:22:05 compute-0 systemd[1]: var-lib-containers-storage-overlay-9e3ac9cd58a11e31de43280e4c8b57708da97f81d0770ac0f892bf4fe77b9dd6-merged.mount: Deactivated successfully.
Jan 21 18:22:05 compute-0 podman[206928]: 2026-01-21 18:22:05.485763226 +0000 UTC m=+0.082145411 container cleanup f8d4fe423e568f35a08872a5c6b419d2f81a002c74c7600a716264705f05ad91 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5c0bfae5-9d33-4194-a6cd-c123314635af, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 21 18:22:05 compute-0 systemd[1]: libpod-conmon-f8d4fe423e568f35a08872a5c6b419d2f81a002c74c7600a716264705f05ad91.scope: Deactivated successfully.
Jan 21 18:22:05 compute-0 nova_compute[183278]: 2026-01-21 18:22:05.526 183284 INFO nova.virt.libvirt.driver [-] [instance: ae4bd600-2611-4780-b4f9-571296621dee] Instance destroyed successfully.
Jan 21 18:22:05 compute-0 nova_compute[183278]: 2026-01-21 18:22:05.527 183284 DEBUG nova.objects.instance [None req-b4520378-7e8a-4658-9b98-4d57700b8ca4 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Lazy-loading 'resources' on Instance uuid ae4bd600-2611-4780-b4f9-571296621dee obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:22:05 compute-0 nova_compute[183278]: 2026-01-21 18:22:05.541 183284 DEBUG nova.virt.libvirt.vif [None req-b4520378-7e8a-4658-9b98-4d57700b8ca4 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-21T18:20:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-2044527699',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-2044527699',id=9,image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T18:21:07Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='05d148a48e724bbaa4c36f8069f80fbd',ramdisk_id='',reservation_id='r-m92y57w2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1250997236',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-1250997236-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T18:21:55Z,user_data=None,user_id='3ab455a0326b442d986277b4d934e2b2',uuid=ae4bd600-2611-4780-b4f9-571296621dee,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2211746e-0564-4c99-81c5-664846dc9eb4", "address": "fa:16:3e:54:8f:7c", "network": {"id": "5c0bfae5-9d33-4194-a6cd-c123314635af", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1349569805-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05d148a48e724bbaa4c36f8069f80fbd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2211746e-05", "ovs_interfaceid": "2211746e-0564-4c99-81c5-664846dc9eb4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 18:22:05 compute-0 nova_compute[183278]: 2026-01-21 18:22:05.542 183284 DEBUG nova.network.os_vif_util [None req-b4520378-7e8a-4658-9b98-4d57700b8ca4 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Converting VIF {"id": "2211746e-0564-4c99-81c5-664846dc9eb4", "address": "fa:16:3e:54:8f:7c", "network": {"id": "5c0bfae5-9d33-4194-a6cd-c123314635af", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1349569805-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05d148a48e724bbaa4c36f8069f80fbd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2211746e-05", "ovs_interfaceid": "2211746e-0564-4c99-81c5-664846dc9eb4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 18:22:05 compute-0 nova_compute[183278]: 2026-01-21 18:22:05.542 183284 DEBUG nova.network.os_vif_util [None req-b4520378-7e8a-4658-9b98-4d57700b8ca4 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:54:8f:7c,bridge_name='br-int',has_traffic_filtering=True,id=2211746e-0564-4c99-81c5-664846dc9eb4,network=Network(5c0bfae5-9d33-4194-a6cd-c123314635af),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2211746e-05') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 18:22:05 compute-0 nova_compute[183278]: 2026-01-21 18:22:05.543 183284 DEBUG os_vif [None req-b4520378-7e8a-4658-9b98-4d57700b8ca4 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:54:8f:7c,bridge_name='br-int',has_traffic_filtering=True,id=2211746e-0564-4c99-81c5-664846dc9eb4,network=Network(5c0bfae5-9d33-4194-a6cd-c123314635af),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2211746e-05') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 21 18:22:05 compute-0 nova_compute[183278]: 2026-01-21 18:22:05.544 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:22:05 compute-0 nova_compute[183278]: 2026-01-21 18:22:05.544 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2211746e-05, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:22:05 compute-0 nova_compute[183278]: 2026-01-21 18:22:05.545 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:22:05 compute-0 nova_compute[183278]: 2026-01-21 18:22:05.547 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 18:22:05 compute-0 nova_compute[183278]: 2026-01-21 18:22:05.549 183284 INFO os_vif [None req-b4520378-7e8a-4658-9b98-4d57700b8ca4 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:54:8f:7c,bridge_name='br-int',has_traffic_filtering=True,id=2211746e-0564-4c99-81c5-664846dc9eb4,network=Network(5c0bfae5-9d33-4194-a6cd-c123314635af),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2211746e-05')
Jan 21 18:22:05 compute-0 nova_compute[183278]: 2026-01-21 18:22:05.549 183284 INFO nova.virt.libvirt.driver [None req-b4520378-7e8a-4658-9b98-4d57700b8ca4 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] [instance: ae4bd600-2611-4780-b4f9-571296621dee] Deleting instance files /var/lib/nova/instances/ae4bd600-2611-4780-b4f9-571296621dee_del
Jan 21 18:22:05 compute-0 nova_compute[183278]: 2026-01-21 18:22:05.550 183284 INFO nova.virt.libvirt.driver [None req-b4520378-7e8a-4658-9b98-4d57700b8ca4 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] [instance: ae4bd600-2611-4780-b4f9-571296621dee] Deletion of /var/lib/nova/instances/ae4bd600-2611-4780-b4f9-571296621dee_del complete
Jan 21 18:22:05 compute-0 podman[206964]: 2026-01-21 18:22:05.557520625 +0000 UTC m=+0.050249729 container remove f8d4fe423e568f35a08872a5c6b419d2f81a002c74c7600a716264705f05ad91 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5c0bfae5-9d33-4194-a6cd-c123314635af, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 21 18:22:05 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:22:05.563 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[64f05f7a-5866-435a-a112-01502f7637b2]: (4, ('Wed Jan 21 06:22:05 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-5c0bfae5-9d33-4194-a6cd-c123314635af (f8d4fe423e568f35a08872a5c6b419d2f81a002c74c7600a716264705f05ad91)\nf8d4fe423e568f35a08872a5c6b419d2f81a002c74c7600a716264705f05ad91\nWed Jan 21 06:22:05 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-5c0bfae5-9d33-4194-a6cd-c123314635af (f8d4fe423e568f35a08872a5c6b419d2f81a002c74c7600a716264705f05ad91)\nf8d4fe423e568f35a08872a5c6b419d2f81a002c74c7600a716264705f05ad91\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:22:05 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:22:05.564 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[d4a728e4-c9ea-4b75-b2cd-6c9d328ef8fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:22:05 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:22:05.564 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5c0bfae5-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:22:05 compute-0 kernel: tap5c0bfae5-90: left promiscuous mode
Jan 21 18:22:05 compute-0 nova_compute[183278]: 2026-01-21 18:22:05.566 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:22:05 compute-0 nova_compute[183278]: 2026-01-21 18:22:05.577 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:22:05 compute-0 nova_compute[183278]: 2026-01-21 18:22:05.578 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:22:05 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:22:05.579 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[d7bb70fd-96c0-409d-950b-7980e7028f67]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:22:05 compute-0 nova_compute[183278]: 2026-01-21 18:22:05.588 183284 INFO nova.compute.manager [None req-b4520378-7e8a-4658-9b98-4d57700b8ca4 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] [instance: ae4bd600-2611-4780-b4f9-571296621dee] Took 0.31 seconds to destroy the instance on the hypervisor.
Jan 21 18:22:05 compute-0 nova_compute[183278]: 2026-01-21 18:22:05.589 183284 DEBUG oslo.service.loopingcall [None req-b4520378-7e8a-4658-9b98-4d57700b8ca4 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 21 18:22:05 compute-0 nova_compute[183278]: 2026-01-21 18:22:05.590 183284 DEBUG nova.compute.manager [-] [instance: ae4bd600-2611-4780-b4f9-571296621dee] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 21 18:22:05 compute-0 nova_compute[183278]: 2026-01-21 18:22:05.590 183284 DEBUG nova.network.neutron [-] [instance: ae4bd600-2611-4780-b4f9-571296621dee] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 21 18:22:05 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:22:05.593 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[e806ebc6-7517-4939-8847-2659d7f94c75]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:22:05 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:22:05.594 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[351965e5-0c90-4e31-b650-c2c56278da4b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:22:05 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:22:05.607 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[8b1b8cab-4dc7-45f1-aeeb-5018a7c5272f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 424747, 'reachable_time': 20720, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 206990, 'error': None, 'target': 'ovnmeta-5c0bfae5-9d33-4194-a6cd-c123314635af', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:22:05 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:22:05.609 105036 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-5c0bfae5-9d33-4194-a6cd-c123314635af deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 21 18:22:05 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:22:05.609 105036 DEBUG oslo.privsep.daemon [-] privsep: reply[8b7b4d93-39aa-4d5b-90ca-732f4d85c2da]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:22:05 compute-0 systemd[1]: run-netns-ovnmeta\x2d5c0bfae5\x2d9d33\x2d4194\x2da6cd\x2dc123314635af.mount: Deactivated successfully.
Jan 21 18:22:05 compute-0 nova_compute[183278]: 2026-01-21 18:22:05.795 183284 DEBUG nova.compute.manager [req-b7429b1e-7a87-4518-8af0-4ab005ec878b req-0b68b4da-e50c-43b5-affb-f06b05537521 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ae4bd600-2611-4780-b4f9-571296621dee] Received event network-vif-unplugged-2211746e-0564-4c99-81c5-664846dc9eb4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:22:05 compute-0 nova_compute[183278]: 2026-01-21 18:22:05.795 183284 DEBUG oslo_concurrency.lockutils [req-b7429b1e-7a87-4518-8af0-4ab005ec878b req-0b68b4da-e50c-43b5-affb-f06b05537521 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "ae4bd600-2611-4780-b4f9-571296621dee-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:22:05 compute-0 nova_compute[183278]: 2026-01-21 18:22:05.795 183284 DEBUG oslo_concurrency.lockutils [req-b7429b1e-7a87-4518-8af0-4ab005ec878b req-0b68b4da-e50c-43b5-affb-f06b05537521 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "ae4bd600-2611-4780-b4f9-571296621dee-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:22:05 compute-0 nova_compute[183278]: 2026-01-21 18:22:05.795 183284 DEBUG oslo_concurrency.lockutils [req-b7429b1e-7a87-4518-8af0-4ab005ec878b req-0b68b4da-e50c-43b5-affb-f06b05537521 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "ae4bd600-2611-4780-b4f9-571296621dee-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:22:05 compute-0 nova_compute[183278]: 2026-01-21 18:22:05.795 183284 DEBUG nova.compute.manager [req-b7429b1e-7a87-4518-8af0-4ab005ec878b req-0b68b4da-e50c-43b5-affb-f06b05537521 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ae4bd600-2611-4780-b4f9-571296621dee] No waiting events found dispatching network-vif-unplugged-2211746e-0564-4c99-81c5-664846dc9eb4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:22:05 compute-0 nova_compute[183278]: 2026-01-21 18:22:05.796 183284 DEBUG nova.compute.manager [req-b7429b1e-7a87-4518-8af0-4ab005ec878b req-0b68b4da-e50c-43b5-affb-f06b05537521 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ae4bd600-2611-4780-b4f9-571296621dee] Received event network-vif-unplugged-2211746e-0564-4c99-81c5-664846dc9eb4 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 21 18:22:06 compute-0 nova_compute[183278]: 2026-01-21 18:22:06.190 183284 DEBUG nova.network.neutron [-] [instance: ae4bd600-2611-4780-b4f9-571296621dee] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:22:06 compute-0 nova_compute[183278]: 2026-01-21 18:22:06.214 183284 INFO nova.compute.manager [-] [instance: ae4bd600-2611-4780-b4f9-571296621dee] Took 0.62 seconds to deallocate network for instance.
Jan 21 18:22:06 compute-0 nova_compute[183278]: 2026-01-21 18:22:06.270 183284 DEBUG oslo_concurrency.lockutils [None req-b4520378-7e8a-4658-9b98-4d57700b8ca4 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:22:06 compute-0 nova_compute[183278]: 2026-01-21 18:22:06.271 183284 DEBUG oslo_concurrency.lockutils [None req-b4520378-7e8a-4658-9b98-4d57700b8ca4 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:22:06 compute-0 nova_compute[183278]: 2026-01-21 18:22:06.274 183284 DEBUG oslo_concurrency.lockutils [None req-b4520378-7e8a-4658-9b98-4d57700b8ca4 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.004s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:22:06 compute-0 nova_compute[183278]: 2026-01-21 18:22:06.287 183284 DEBUG nova.compute.manager [req-74d6414e-a370-4069-ac1e-0cbb61952757 req-f5547ed3-7668-4ed0-965d-4313defc9bef 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ae4bd600-2611-4780-b4f9-571296621dee] Received event network-vif-deleted-2211746e-0564-4c99-81c5-664846dc9eb4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:22:06 compute-0 nova_compute[183278]: 2026-01-21 18:22:06.305 183284 INFO nova.scheduler.client.report [None req-b4520378-7e8a-4658-9b98-4d57700b8ca4 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Deleted allocations for instance ae4bd600-2611-4780-b4f9-571296621dee
Jan 21 18:22:06 compute-0 nova_compute[183278]: 2026-01-21 18:22:06.378 183284 DEBUG oslo_concurrency.lockutils [None req-b4520378-7e8a-4658-9b98-4d57700b8ca4 3ab455a0326b442d986277b4d934e2b2 05d148a48e724bbaa4c36f8069f80fbd - - default default] Lock "ae4bd600-2611-4780-b4f9-571296621dee" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.107s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:22:07 compute-0 nova_compute[183278]: 2026-01-21 18:22:07.875 183284 DEBUG nova.compute.manager [req-eb480b30-1961-4bde-8ccf-2f9e59a5d75c req-c7fad789-4da1-4092-bfd1-e807291628b1 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ae4bd600-2611-4780-b4f9-571296621dee] Received event network-vif-plugged-2211746e-0564-4c99-81c5-664846dc9eb4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:22:07 compute-0 nova_compute[183278]: 2026-01-21 18:22:07.875 183284 DEBUG oslo_concurrency.lockutils [req-eb480b30-1961-4bde-8ccf-2f9e59a5d75c req-c7fad789-4da1-4092-bfd1-e807291628b1 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "ae4bd600-2611-4780-b4f9-571296621dee-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:22:07 compute-0 nova_compute[183278]: 2026-01-21 18:22:07.876 183284 DEBUG oslo_concurrency.lockutils [req-eb480b30-1961-4bde-8ccf-2f9e59a5d75c req-c7fad789-4da1-4092-bfd1-e807291628b1 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "ae4bd600-2611-4780-b4f9-571296621dee-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:22:07 compute-0 nova_compute[183278]: 2026-01-21 18:22:07.876 183284 DEBUG oslo_concurrency.lockutils [req-eb480b30-1961-4bde-8ccf-2f9e59a5d75c req-c7fad789-4da1-4092-bfd1-e807291628b1 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "ae4bd600-2611-4780-b4f9-571296621dee-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:22:07 compute-0 nova_compute[183278]: 2026-01-21 18:22:07.876 183284 DEBUG nova.compute.manager [req-eb480b30-1961-4bde-8ccf-2f9e59a5d75c req-c7fad789-4da1-4092-bfd1-e807291628b1 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ae4bd600-2611-4780-b4f9-571296621dee] No waiting events found dispatching network-vif-plugged-2211746e-0564-4c99-81c5-664846dc9eb4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:22:07 compute-0 nova_compute[183278]: 2026-01-21 18:22:07.877 183284 WARNING nova.compute.manager [req-eb480b30-1961-4bde-8ccf-2f9e59a5d75c req-c7fad789-4da1-4092-bfd1-e807291628b1 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ae4bd600-2611-4780-b4f9-571296621dee] Received unexpected event network-vif-plugged-2211746e-0564-4c99-81c5-664846dc9eb4 for instance with vm_state deleted and task_state None.
Jan 21 18:22:09 compute-0 nova_compute[183278]: 2026-01-21 18:22:09.967 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:22:10 compute-0 nova_compute[183278]: 2026-01-21 18:22:10.546 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:22:11 compute-0 nova_compute[183278]: 2026-01-21 18:22:11.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:22:11 compute-0 nova_compute[183278]: 2026-01-21 18:22:11.817 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 18:22:11 compute-0 nova_compute[183278]: 2026-01-21 18:22:11.817 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 18:22:11 compute-0 nova_compute[183278]: 2026-01-21 18:22:11.913 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 21 18:22:13 compute-0 nova_compute[183278]: 2026-01-21 18:22:13.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:22:13 compute-0 nova_compute[183278]: 2026-01-21 18:22:13.949 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:22:13 compute-0 nova_compute[183278]: 2026-01-21 18:22:13.949 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:22:13 compute-0 nova_compute[183278]: 2026-01-21 18:22:13.950 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:22:13 compute-0 nova_compute[183278]: 2026-01-21 18:22:13.950 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 18:22:14 compute-0 nova_compute[183278]: 2026-01-21 18:22:14.098 183284 WARNING nova.virt.libvirt.driver [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 18:22:14 compute-0 nova_compute[183278]: 2026-01-21 18:22:14.099 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5861MB free_disk=73.38157272338867GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 18:22:14 compute-0 nova_compute[183278]: 2026-01-21 18:22:14.099 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:22:14 compute-0 nova_compute[183278]: 2026-01-21 18:22:14.099 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:22:14 compute-0 nova_compute[183278]: 2026-01-21 18:22:14.422 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 18:22:14 compute-0 nova_compute[183278]: 2026-01-21 18:22:14.422 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 18:22:14 compute-0 nova_compute[183278]: 2026-01-21 18:22:14.441 183284 DEBUG nova.compute.provider_tree [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Inventory has not changed in ProviderTree for provider: 502e4243-611b-433d-a766-9b485d51652d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 18:22:14 compute-0 nova_compute[183278]: 2026-01-21 18:22:14.454 183284 DEBUG nova.scheduler.client.report [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Inventory has not changed for provider 502e4243-611b-433d-a766-9b485d51652d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 18:22:14 compute-0 nova_compute[183278]: 2026-01-21 18:22:14.476 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 18:22:14 compute-0 nova_compute[183278]: 2026-01-21 18:22:14.477 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.378s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:22:14 compute-0 nova_compute[183278]: 2026-01-21 18:22:14.997 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:22:15 compute-0 nova_compute[183278]: 2026-01-21 18:22:15.547 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:22:15 compute-0 nova_compute[183278]: 2026-01-21 18:22:15.905 183284 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769019720.9045742, c5a6214c-5527-4115-ad3b-6320d177029b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:22:15 compute-0 nova_compute[183278]: 2026-01-21 18:22:15.905 183284 INFO nova.compute.manager [-] [instance: c5a6214c-5527-4115-ad3b-6320d177029b] VM Stopped (Lifecycle Event)
Jan 21 18:22:15 compute-0 nova_compute[183278]: 2026-01-21 18:22:15.924 183284 DEBUG nova.compute.manager [None req-ac9671af-7d1a-4f07-8834-2043a6e7656f - - - - - -] [instance: c5a6214c-5527-4115-ad3b-6320d177029b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:22:16 compute-0 nova_compute[183278]: 2026-01-21 18:22:16.477 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:22:16 compute-0 nova_compute[183278]: 2026-01-21 18:22:16.477 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:22:16 compute-0 nova_compute[183278]: 2026-01-21 18:22:16.812 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:22:16 compute-0 nova_compute[183278]: 2026-01-21 18:22:16.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:22:16 compute-0 nova_compute[183278]: 2026-01-21 18:22:16.816 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 18:22:17 compute-0 nova_compute[183278]: 2026-01-21 18:22:17.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:22:18 compute-0 podman[206992]: 2026-01-21 18:22:18.003435752 +0000 UTC m=+0.062253939 container health_status 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.buildah.version=1.33.7, name=ubi9-minimal, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, config_id=openstack_network_exporter, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, distribution-scope=public, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 21 18:22:18 compute-0 nova_compute[183278]: 2026-01-21 18:22:18.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:22:19 compute-0 nova_compute[183278]: 2026-01-21 18:22:19.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:22:20 compute-0 nova_compute[183278]: 2026-01-21 18:22:20.053 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:22:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:22:20.078 104698 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:22:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:22:20.079 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:22:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:22:20.079 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:22:20 compute-0 nova_compute[183278]: 2026-01-21 18:22:20.525 183284 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769019725.5243764, ae4bd600-2611-4780-b4f9-571296621dee => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:22:20 compute-0 nova_compute[183278]: 2026-01-21 18:22:20.526 183284 INFO nova.compute.manager [-] [instance: ae4bd600-2611-4780-b4f9-571296621dee] VM Stopped (Lifecycle Event)
Jan 21 18:22:20 compute-0 nova_compute[183278]: 2026-01-21 18:22:20.550 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:22:20 compute-0 nova_compute[183278]: 2026-01-21 18:22:20.659 183284 DEBUG nova.compute.manager [None req-65b53d42-ab60-4937-9026-c75cb2b312b1 - - - - - -] [instance: ae4bd600-2611-4780-b4f9-571296621dee] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:22:24 compute-0 podman[207014]: 2026-01-21 18:22:24.99011097 +0000 UTC m=+0.046924457 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202)
Jan 21 18:22:25 compute-0 nova_compute[183278]: 2026-01-21 18:22:25.056 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:22:25 compute-0 podman[207013]: 2026-01-21 18:22:25.068346386 +0000 UTC m=+0.127457589 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Jan 21 18:22:25 compute-0 nova_compute[183278]: 2026-01-21 18:22:25.551 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:22:26 compute-0 sshd-session[207054]: Invalid user eigenda from 64.227.98.100 port 51586
Jan 21 18:22:26 compute-0 sshd-session[207054]: Connection closed by invalid user eigenda 64.227.98.100 port 51586 [preauth]
Jan 21 18:22:29 compute-0 podman[192560]: time="2026-01-21T18:22:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 18:22:29 compute-0 podman[192560]: @ - - [21/Jan/2026:18:22:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15354 "" "Go-http-client/1.1"
Jan 21 18:22:29 compute-0 podman[192560]: @ - - [21/Jan/2026:18:22:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2172 "" "Go-http-client/1.1"
Jan 21 18:22:30 compute-0 podman[207056]: 2026-01-21 18:22:30.008375457 +0000 UTC m=+0.055568096 container health_status 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 21 18:22:30 compute-0 nova_compute[183278]: 2026-01-21 18:22:30.057 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:22:30 compute-0 nova_compute[183278]: 2026-01-21 18:22:30.553 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:22:31 compute-0 openstack_network_exporter[195402]: ERROR   18:22:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 21 18:22:31 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:22:31 compute-0 openstack_network_exporter[195402]: ERROR   18:22:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 21 18:22:31 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:22:34 compute-0 nova_compute[183278]: 2026-01-21 18:22:34.888 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:22:35 compute-0 nova_compute[183278]: 2026-01-21 18:22:35.058 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:22:35 compute-0 nova_compute[183278]: 2026-01-21 18:22:35.555 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:22:40 compute-0 nova_compute[183278]: 2026-01-21 18:22:40.059 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:22:40 compute-0 nova_compute[183278]: 2026-01-21 18:22:40.556 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:22:45 compute-0 nova_compute[183278]: 2026-01-21 18:22:45.061 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:22:45 compute-0 nova_compute[183278]: 2026-01-21 18:22:45.601 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:22:46 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:22:46.743 104698 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e2:aa:66', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f6:39:87:73:c8'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 18:22:46 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:22:46.744 104698 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 21 18:22:46 compute-0 nova_compute[183278]: 2026-01-21 18:22:46.749 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:22:48 compute-0 podman[207081]: 2026-01-21 18:22:48.996443875 +0000 UTC m=+0.056295965 container health_status 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, version=9.6, config_id=openstack_network_exporter, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, vendor=Red Hat, Inc., managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=)
Jan 21 18:22:49 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:22:49.746 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a4db021d-a451-4e5f-8011-49af760bda68, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:22:50 compute-0 nova_compute[183278]: 2026-01-21 18:22:50.063 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:22:50 compute-0 nova_compute[183278]: 2026-01-21 18:22:50.639 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:22:55 compute-0 nova_compute[183278]: 2026-01-21 18:22:55.064 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:22:55 compute-0 nova_compute[183278]: 2026-01-21 18:22:55.641 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:22:56 compute-0 podman[207103]: 2026-01-21 18:22:56.00616804 +0000 UTC m=+0.048907606 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Jan 21 18:22:56 compute-0 podman[207102]: 2026-01-21 18:22:56.039319974 +0000 UTC m=+0.086559518 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, 
tcib_managed=true, container_name=ovn_controller)
Jan 21 18:22:59 compute-0 podman[192560]: time="2026-01-21T18:22:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 18:22:59 compute-0 podman[192560]: @ - - [21/Jan/2026:18:22:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15354 "" "Go-http-client/1.1"
Jan 21 18:22:59 compute-0 podman[192560]: @ - - [21/Jan/2026:18:22:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2175 "" "Go-http-client/1.1"
Jan 21 18:23:00 compute-0 nova_compute[183278]: 2026-01-21 18:23:00.066 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:23:00 compute-0 nova_compute[183278]: 2026-01-21 18:23:00.678 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:23:01 compute-0 podman[207148]: 2026-01-21 18:23:01.059244102 +0000 UTC m=+0.099141833 container health_status 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 21 18:23:01 compute-0 openstack_network_exporter[195402]: ERROR   18:23:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 21 18:23:01 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:23:01 compute-0 openstack_network_exporter[195402]: ERROR   18:23:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 21 18:23:01 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:23:05 compute-0 nova_compute[183278]: 2026-01-21 18:23:05.098 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:23:05 compute-0 nova_compute[183278]: 2026-01-21 18:23:05.680 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:23:10 compute-0 nova_compute[183278]: 2026-01-21 18:23:10.098 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:23:10 compute-0 nova_compute[183278]: 2026-01-21 18:23:10.682 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:23:11 compute-0 nova_compute[183278]: 2026-01-21 18:23:11.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:23:11 compute-0 nova_compute[183278]: 2026-01-21 18:23:11.817 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 18:23:11 compute-0 nova_compute[183278]: 2026-01-21 18:23:11.818 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 18:23:11 compute-0 nova_compute[183278]: 2026-01-21 18:23:11.831 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 21 18:23:13 compute-0 ovn_controller[95419]: 2026-01-21T18:23:13Z|00088|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Jan 21 18:23:14 compute-0 nova_compute[183278]: 2026-01-21 18:23:14.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:23:14 compute-0 nova_compute[183278]: 2026-01-21 18:23:14.844 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:23:14 compute-0 nova_compute[183278]: 2026-01-21 18:23:14.845 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:23:14 compute-0 nova_compute[183278]: 2026-01-21 18:23:14.845 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:23:14 compute-0 nova_compute[183278]: 2026-01-21 18:23:14.845 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 18:23:14 compute-0 nova_compute[183278]: 2026-01-21 18:23:14.993 183284 WARNING nova.virt.libvirt.driver [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 18:23:14 compute-0 nova_compute[183278]: 2026-01-21 18:23:14.994 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5881MB free_disk=73.38155364990234GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 18:23:14 compute-0 nova_compute[183278]: 2026-01-21 18:23:14.995 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:23:14 compute-0 nova_compute[183278]: 2026-01-21 18:23:14.995 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:23:15 compute-0 nova_compute[183278]: 2026-01-21 18:23:15.056 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 18:23:15 compute-0 nova_compute[183278]: 2026-01-21 18:23:15.057 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 18:23:15 compute-0 nova_compute[183278]: 2026-01-21 18:23:15.073 183284 DEBUG nova.scheduler.client.report [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Refreshing inventories for resource provider 502e4243-611b-433d-a766-9b485d51652d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 21 18:23:15 compute-0 nova_compute[183278]: 2026-01-21 18:23:15.092 183284 DEBUG nova.scheduler.client.report [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Updating ProviderTree inventory for provider 502e4243-611b-433d-a766-9b485d51652d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 21 18:23:15 compute-0 nova_compute[183278]: 2026-01-21 18:23:15.092 183284 DEBUG nova.compute.provider_tree [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Updating inventory in ProviderTree for provider 502e4243-611b-433d-a766-9b485d51652d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 21 18:23:15 compute-0 nova_compute[183278]: 2026-01-21 18:23:15.099 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:23:15 compute-0 nova_compute[183278]: 2026-01-21 18:23:15.109 183284 DEBUG nova.scheduler.client.report [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Refreshing aggregate associations for resource provider 502e4243-611b-433d-a766-9b485d51652d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 21 18:23:15 compute-0 nova_compute[183278]: 2026-01-21 18:23:15.133 183284 DEBUG nova.scheduler.client.report [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Refreshing trait associations for resource provider 502e4243-611b-433d-a766-9b485d51652d, traits: COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_RESCUE_BFV,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_MMX,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NODE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_IDE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 21 18:23:15 compute-0 nova_compute[183278]: 2026-01-21 18:23:15.161 183284 DEBUG nova.compute.provider_tree [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Inventory has not changed in ProviderTree for provider: 502e4243-611b-433d-a766-9b485d51652d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 18:23:15 compute-0 nova_compute[183278]: 2026-01-21 18:23:15.183 183284 DEBUG nova.scheduler.client.report [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Inventory has not changed for provider 502e4243-611b-433d-a766-9b485d51652d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 18:23:15 compute-0 nova_compute[183278]: 2026-01-21 18:23:15.184 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 18:23:15 compute-0 nova_compute[183278]: 2026-01-21 18:23:15.184 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.190s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:23:15 compute-0 nova_compute[183278]: 2026-01-21 18:23:15.684 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:23:16 compute-0 nova_compute[183278]: 2026-01-21 18:23:16.185 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:23:17 compute-0 nova_compute[183278]: 2026-01-21 18:23:17.812 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:23:17 compute-0 nova_compute[183278]: 2026-01-21 18:23:17.815 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:23:18 compute-0 nova_compute[183278]: 2026-01-21 18:23:18.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:23:18 compute-0 nova_compute[183278]: 2026-01-21 18:23:18.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:23:18 compute-0 nova_compute[183278]: 2026-01-21 18:23:18.818 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 18:23:19 compute-0 nova_compute[183278]: 2026-01-21 18:23:19.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:23:19 compute-0 nova_compute[183278]: 2026-01-21 18:23:19.818 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:23:20 compute-0 podman[207173]: 2026-01-21 18:23:20.011429491 +0000 UTC m=+0.073139801 container health_status 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., version=9.6, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': 
['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41)
Jan 21 18:23:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:23:20.079 104698 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:23:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:23:20.080 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:23:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:23:20.080 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:23:20 compute-0 nova_compute[183278]: 2026-01-21 18:23:20.113 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:23:20 compute-0 nova_compute[183278]: 2026-01-21 18:23:20.685 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:23:21 compute-0 nova_compute[183278]: 2026-01-21 18:23:21.812 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:23:25 compute-0 nova_compute[183278]: 2026-01-21 18:23:25.114 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:23:25 compute-0 nova_compute[183278]: 2026-01-21 18:23:25.686 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:23:27 compute-0 podman[207198]: 2026-01-21 18:23:27.034431758 +0000 UTC m=+0.088659638 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 21 18:23:27 compute-0 podman[207197]: 2026-01-21 18:23:27.053292846 +0000 UTC m=+0.114362332 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, 
config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 21 18:23:29 compute-0 podman[192560]: time="2026-01-21T18:23:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 18:23:29 compute-0 podman[192560]: @ - - [21/Jan/2026:18:23:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15354 "" "Go-http-client/1.1"
Jan 21 18:23:29 compute-0 podman[192560]: @ - - [21/Jan/2026:18:23:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2176 "" "Go-http-client/1.1"
Jan 21 18:23:30 compute-0 nova_compute[183278]: 2026-01-21 18:23:30.116 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:23:30 compute-0 nova_compute[183278]: 2026-01-21 18:23:30.687 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:23:31 compute-0 openstack_network_exporter[195402]: ERROR   18:23:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 21 18:23:31 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:23:31 compute-0 openstack_network_exporter[195402]: ERROR   18:23:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 21 18:23:31 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:23:31 compute-0 podman[207242]: 2026-01-21 18:23:31.996419454 +0000 UTC m=+0.049380838 container health_status 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 18:23:35 compute-0 nova_compute[183278]: 2026-01-21 18:23:35.117 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:23:35 compute-0 nova_compute[183278]: 2026-01-21 18:23:35.688 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:23:40 compute-0 nova_compute[183278]: 2026-01-21 18:23:40.119 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:23:40 compute-0 nova_compute[183278]: 2026-01-21 18:23:40.690 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:23:45 compute-0 nova_compute[183278]: 2026-01-21 18:23:45.120 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:23:45 compute-0 nova_compute[183278]: 2026-01-21 18:23:45.708 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:23:49 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Jan 21 18:23:50 compute-0 nova_compute[183278]: 2026-01-21 18:23:50.122 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:23:50 compute-0 nova_compute[183278]: 2026-01-21 18:23:50.710 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:23:50 compute-0 podman[207269]: 2026-01-21 18:23:50.993663411 +0000 UTC m=+0.054409608 container health_status 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., 
build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, com.redhat.component=ubi9-minimal-container, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, architecture=x86_64, container_name=openstack_network_exporter)
Jan 21 18:23:53 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:23:53.773 104698 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e2:aa:66', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f6:39:87:73:c8'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 18:23:53 compute-0 nova_compute[183278]: 2026-01-21 18:23:53.773 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:23:53 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:23:53.773 104698 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 21 18:23:53 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:23:53.774 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a4db021d-a451-4e5f-8011-49af760bda68, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:23:55 compute-0 nova_compute[183278]: 2026-01-21 18:23:55.135 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:23:55 compute-0 nova_compute[183278]: 2026-01-21 18:23:55.712 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:23:57 compute-0 podman[207291]: 2026-01-21 18:23:57.98744888 +0000 UTC m=+0.043004663 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 21 18:23:58 compute-0 podman[207290]: 2026-01-21 18:23:58.015603882 +0000 UTC m=+0.075666344 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 21 18:23:59 compute-0 podman[192560]: time="2026-01-21T18:23:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 18:23:59 compute-0 podman[192560]: @ - - [21/Jan/2026:18:23:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15354 "" "Go-http-client/1.1"
Jan 21 18:23:59 compute-0 podman[192560]: @ - - [21/Jan/2026:18:23:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2174 "" "Go-http-client/1.1"
Jan 21 18:24:00 compute-0 nova_compute[183278]: 2026-01-21 18:24:00.136 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:24:00 compute-0 nova_compute[183278]: 2026-01-21 18:24:00.714 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:24:01 compute-0 openstack_network_exporter[195402]: ERROR   18:24:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 21 18:24:01 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:24:01 compute-0 openstack_network_exporter[195402]: ERROR   18:24:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 21 18:24:01 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:24:03 compute-0 podman[207333]: 2026-01-21 18:24:03.015615898 +0000 UTC m=+0.072041636 container health_status 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 21 18:24:05 compute-0 nova_compute[183278]: 2026-01-21 18:24:05.137 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:24:05 compute-0 nova_compute[183278]: 2026-01-21 18:24:05.757 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:24:09 compute-0 nova_compute[183278]: 2026-01-21 18:24:09.186 183284 DEBUG oslo_concurrency.lockutils [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Acquiring lock "846c40aa-a089-4213-89d3-b56681e73e18" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:24:09 compute-0 nova_compute[183278]: 2026-01-21 18:24:09.186 183284 DEBUG oslo_concurrency.lockutils [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "846c40aa-a089-4213-89d3-b56681e73e18" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:24:09 compute-0 nova_compute[183278]: 2026-01-21 18:24:09.237 183284 DEBUG nova.compute.manager [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 846c40aa-a089-4213-89d3-b56681e73e18] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 21 18:24:09 compute-0 nova_compute[183278]: 2026-01-21 18:24:09.527 183284 DEBUG oslo_concurrency.lockutils [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:24:09 compute-0 nova_compute[183278]: 2026-01-21 18:24:09.528 183284 DEBUG oslo_concurrency.lockutils [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:24:09 compute-0 nova_compute[183278]: 2026-01-21 18:24:09.535 183284 DEBUG nova.virt.hardware [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 18:24:09 compute-0 nova_compute[183278]: 2026-01-21 18:24:09.536 183284 INFO nova.compute.claims [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 846c40aa-a089-4213-89d3-b56681e73e18] Claim successful on node compute-0.ctlplane.example.com
Jan 21 18:24:09 compute-0 nova_compute[183278]: 2026-01-21 18:24:09.672 183284 DEBUG nova.compute.provider_tree [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Inventory has not changed in ProviderTree for provider: 502e4243-611b-433d-a766-9b485d51652d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 18:24:09 compute-0 nova_compute[183278]: 2026-01-21 18:24:09.685 183284 DEBUG nova.scheduler.client.report [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Inventory has not changed for provider 502e4243-611b-433d-a766-9b485d51652d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 18:24:09 compute-0 nova_compute[183278]: 2026-01-21 18:24:09.719 183284 DEBUG oslo_concurrency.lockutils [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.191s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:24:09 compute-0 nova_compute[183278]: 2026-01-21 18:24:09.720 183284 DEBUG nova.compute.manager [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 846c40aa-a089-4213-89d3-b56681e73e18] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 21 18:24:09 compute-0 nova_compute[183278]: 2026-01-21 18:24:09.766 183284 DEBUG nova.compute.manager [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 846c40aa-a089-4213-89d3-b56681e73e18] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 21 18:24:09 compute-0 nova_compute[183278]: 2026-01-21 18:24:09.767 183284 DEBUG nova.network.neutron [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 846c40aa-a089-4213-89d3-b56681e73e18] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 21 18:24:09 compute-0 nova_compute[183278]: 2026-01-21 18:24:09.788 183284 INFO nova.virt.libvirt.driver [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 846c40aa-a089-4213-89d3-b56681e73e18] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 21 18:24:09 compute-0 nova_compute[183278]: 2026-01-21 18:24:09.839 183284 DEBUG nova.compute.manager [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 846c40aa-a089-4213-89d3-b56681e73e18] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 21 18:24:09 compute-0 nova_compute[183278]: 2026-01-21 18:24:09.933 183284 DEBUG nova.compute.manager [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 846c40aa-a089-4213-89d3-b56681e73e18] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 21 18:24:09 compute-0 nova_compute[183278]: 2026-01-21 18:24:09.935 183284 DEBUG nova.virt.libvirt.driver [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 846c40aa-a089-4213-89d3-b56681e73e18] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 18:24:09 compute-0 nova_compute[183278]: 2026-01-21 18:24:09.936 183284 INFO nova.virt.libvirt.driver [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 846c40aa-a089-4213-89d3-b56681e73e18] Creating image(s)
Jan 21 18:24:09 compute-0 nova_compute[183278]: 2026-01-21 18:24:09.937 183284 DEBUG oslo_concurrency.lockutils [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Acquiring lock "/var/lib/nova/instances/846c40aa-a089-4213-89d3-b56681e73e18/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:24:09 compute-0 nova_compute[183278]: 2026-01-21 18:24:09.937 183284 DEBUG oslo_concurrency.lockutils [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "/var/lib/nova/instances/846c40aa-a089-4213-89d3-b56681e73e18/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:24:09 compute-0 nova_compute[183278]: 2026-01-21 18:24:09.939 183284 DEBUG oslo_concurrency.lockutils [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "/var/lib/nova/instances/846c40aa-a089-4213-89d3-b56681e73e18/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:24:09 compute-0 nova_compute[183278]: 2026-01-21 18:24:09.960 183284 DEBUG nova.policy [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '41dc6e790bc54fbfaf5c6007d3fa5f63', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'fe688847145f4dee992c72dd40bbc1ac', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 21 18:24:09 compute-0 nova_compute[183278]: 2026-01-21 18:24:09.964 183284 DEBUG oslo_concurrency.processutils [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:24:10 compute-0 nova_compute[183278]: 2026-01-21 18:24:10.022 183284 DEBUG oslo_concurrency.processutils [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:24:10 compute-0 nova_compute[183278]: 2026-01-21 18:24:10.023 183284 DEBUG oslo_concurrency.lockutils [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Acquiring lock "8bf45782a806e8f2f684ae874be9ab99d891a685" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:24:10 compute-0 nova_compute[183278]: 2026-01-21 18:24:10.024 183284 DEBUG oslo_concurrency.lockutils [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "8bf45782a806e8f2f684ae874be9ab99d891a685" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:24:10 compute-0 nova_compute[183278]: 2026-01-21 18:24:10.034 183284 DEBUG oslo_concurrency.processutils [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:24:10 compute-0 nova_compute[183278]: 2026-01-21 18:24:10.087 183284 DEBUG oslo_concurrency.processutils [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:24:10 compute-0 nova_compute[183278]: 2026-01-21 18:24:10.088 183284 DEBUG oslo_concurrency.processutils [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685,backing_fmt=raw /var/lib/nova/instances/846c40aa-a089-4213-89d3-b56681e73e18/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:24:10 compute-0 nova_compute[183278]: 2026-01-21 18:24:10.125 183284 DEBUG oslo_concurrency.processutils [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685,backing_fmt=raw /var/lib/nova/instances/846c40aa-a089-4213-89d3-b56681e73e18/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:24:10 compute-0 nova_compute[183278]: 2026-01-21 18:24:10.126 183284 DEBUG oslo_concurrency.lockutils [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "8bf45782a806e8f2f684ae874be9ab99d891a685" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.103s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:24:10 compute-0 nova_compute[183278]: 2026-01-21 18:24:10.127 183284 DEBUG oslo_concurrency.processutils [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:24:10 compute-0 nova_compute[183278]: 2026-01-21 18:24:10.140 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:24:10 compute-0 nova_compute[183278]: 2026-01-21 18:24:10.178 183284 DEBUG oslo_concurrency.processutils [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:24:10 compute-0 nova_compute[183278]: 2026-01-21 18:24:10.179 183284 DEBUG nova.virt.disk.api [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Checking if we can resize image /var/lib/nova/instances/846c40aa-a089-4213-89d3-b56681e73e18/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 18:24:10 compute-0 nova_compute[183278]: 2026-01-21 18:24:10.179 183284 DEBUG oslo_concurrency.processutils [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/846c40aa-a089-4213-89d3-b56681e73e18/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:24:10 compute-0 nova_compute[183278]: 2026-01-21 18:24:10.236 183284 DEBUG oslo_concurrency.processutils [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/846c40aa-a089-4213-89d3-b56681e73e18/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:24:10 compute-0 nova_compute[183278]: 2026-01-21 18:24:10.238 183284 DEBUG nova.virt.disk.api [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Cannot resize image /var/lib/nova/instances/846c40aa-a089-4213-89d3-b56681e73e18/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 18:24:10 compute-0 nova_compute[183278]: 2026-01-21 18:24:10.238 183284 DEBUG nova.objects.instance [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lazy-loading 'migration_context' on Instance uuid 846c40aa-a089-4213-89d3-b56681e73e18 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:24:10 compute-0 nova_compute[183278]: 2026-01-21 18:24:10.261 183284 DEBUG nova.virt.libvirt.driver [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 846c40aa-a089-4213-89d3-b56681e73e18] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 18:24:10 compute-0 nova_compute[183278]: 2026-01-21 18:24:10.262 183284 DEBUG nova.virt.libvirt.driver [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 846c40aa-a089-4213-89d3-b56681e73e18] Ensure instance console log exists: /var/lib/nova/instances/846c40aa-a089-4213-89d3-b56681e73e18/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 18:24:10 compute-0 nova_compute[183278]: 2026-01-21 18:24:10.262 183284 DEBUG oslo_concurrency.lockutils [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:24:10 compute-0 nova_compute[183278]: 2026-01-21 18:24:10.262 183284 DEBUG oslo_concurrency.lockutils [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:24:10 compute-0 nova_compute[183278]: 2026-01-21 18:24:10.263 183284 DEBUG oslo_concurrency.lockutils [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:24:10 compute-0 nova_compute[183278]: 2026-01-21 18:24:10.759 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:24:10 compute-0 nova_compute[183278]: 2026-01-21 18:24:10.851 183284 DEBUG nova.network.neutron [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 846c40aa-a089-4213-89d3-b56681e73e18] Successfully created port: 260f28a8-7a1b-454a-830d-2f41597334af _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 21 18:24:11 compute-0 nova_compute[183278]: 2026-01-21 18:24:11.467 183284 DEBUG nova.network.neutron [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 846c40aa-a089-4213-89d3-b56681e73e18] Successfully updated port: 260f28a8-7a1b-454a-830d-2f41597334af _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 21 18:24:11 compute-0 nova_compute[183278]: 2026-01-21 18:24:11.485 183284 DEBUG oslo_concurrency.lockutils [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Acquiring lock "refresh_cache-846c40aa-a089-4213-89d3-b56681e73e18" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:24:11 compute-0 nova_compute[183278]: 2026-01-21 18:24:11.485 183284 DEBUG oslo_concurrency.lockutils [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Acquired lock "refresh_cache-846c40aa-a089-4213-89d3-b56681e73e18" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 18:24:11 compute-0 nova_compute[183278]: 2026-01-21 18:24:11.485 183284 DEBUG nova.network.neutron [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 846c40aa-a089-4213-89d3-b56681e73e18] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 18:24:11 compute-0 nova_compute[183278]: 2026-01-21 18:24:11.568 183284 DEBUG nova.compute.manager [req-67b01a33-4855-4b9b-8a8e-dba676635b04 req-400327d9-1ae2-4c23-9dac-57db193468ec 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 846c40aa-a089-4213-89d3-b56681e73e18] Received event network-changed-260f28a8-7a1b-454a-830d-2f41597334af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:24:11 compute-0 nova_compute[183278]: 2026-01-21 18:24:11.569 183284 DEBUG nova.compute.manager [req-67b01a33-4855-4b9b-8a8e-dba676635b04 req-400327d9-1ae2-4c23-9dac-57db193468ec 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 846c40aa-a089-4213-89d3-b56681e73e18] Refreshing instance network info cache due to event network-changed-260f28a8-7a1b-454a-830d-2f41597334af. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 18:24:11 compute-0 nova_compute[183278]: 2026-01-21 18:24:11.569 183284 DEBUG oslo_concurrency.lockutils [req-67b01a33-4855-4b9b-8a8e-dba676635b04 req-400327d9-1ae2-4c23-9dac-57db193468ec 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "refresh_cache-846c40aa-a089-4213-89d3-b56681e73e18" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:24:11 compute-0 nova_compute[183278]: 2026-01-21 18:24:11.618 183284 DEBUG nova.network.neutron [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 846c40aa-a089-4213-89d3-b56681e73e18] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 18:24:13 compute-0 nova_compute[183278]: 2026-01-21 18:24:13.715 183284 DEBUG nova.network.neutron [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 846c40aa-a089-4213-89d3-b56681e73e18] Updating instance_info_cache with network_info: [{"id": "260f28a8-7a1b-454a-830d-2f41597334af", "address": "fa:16:3e:f4:3c:81", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap260f28a8-7a", "ovs_interfaceid": "260f28a8-7a1b-454a-830d-2f41597334af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:24:13 compute-0 nova_compute[183278]: 2026-01-21 18:24:13.731 183284 DEBUG oslo_concurrency.lockutils [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Releasing lock "refresh_cache-846c40aa-a089-4213-89d3-b56681e73e18" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 18:24:13 compute-0 nova_compute[183278]: 2026-01-21 18:24:13.731 183284 DEBUG nova.compute.manager [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 846c40aa-a089-4213-89d3-b56681e73e18] Instance network_info: |[{"id": "260f28a8-7a1b-454a-830d-2f41597334af", "address": "fa:16:3e:f4:3c:81", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap260f28a8-7a", "ovs_interfaceid": "260f28a8-7a1b-454a-830d-2f41597334af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 21 18:24:13 compute-0 nova_compute[183278]: 2026-01-21 18:24:13.732 183284 DEBUG oslo_concurrency.lockutils [req-67b01a33-4855-4b9b-8a8e-dba676635b04 req-400327d9-1ae2-4c23-9dac-57db193468ec 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquired lock "refresh_cache-846c40aa-a089-4213-89d3-b56681e73e18" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 18:24:13 compute-0 nova_compute[183278]: 2026-01-21 18:24:13.732 183284 DEBUG nova.network.neutron [req-67b01a33-4855-4b9b-8a8e-dba676635b04 req-400327d9-1ae2-4c23-9dac-57db193468ec 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 846c40aa-a089-4213-89d3-b56681e73e18] Refreshing network info cache for port 260f28a8-7a1b-454a-830d-2f41597334af _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 18:24:13 compute-0 nova_compute[183278]: 2026-01-21 18:24:13.734 183284 DEBUG nova.virt.libvirt.driver [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 846c40aa-a089-4213-89d3-b56681e73e18] Start _get_guest_xml network_info=[{"id": "260f28a8-7a1b-454a-830d-2f41597334af", "address": "fa:16:3e:f4:3c:81", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap260f28a8-7a", "ovs_interfaceid": "260f28a8-7a1b-454a-830d-2f41597334af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T18:09:31Z,direct_url=<?>,disk_format='qcow2',id=672306ae-5521-4fc1-a825-a16d6d125c61,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='05bd8d3790e24414a90ecf55ee1939e1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T18:09:33Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_type': 'disk', 'encryption_options': None, 'boot_index': 0, 'encryption_secret_uuid': None, 'encryption_format': None, 'image_id': '672306ae-5521-4fc1-a825-a16d6d125c61'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 18:24:13 compute-0 nova_compute[183278]: 2026-01-21 18:24:13.739 183284 WARNING nova.virt.libvirt.driver [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 18:24:13 compute-0 nova_compute[183278]: 2026-01-21 18:24:13.748 183284 DEBUG nova.virt.libvirt.host [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 18:24:13 compute-0 nova_compute[183278]: 2026-01-21 18:24:13.748 183284 DEBUG nova.virt.libvirt.host [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 18:24:13 compute-0 nova_compute[183278]: 2026-01-21 18:24:13.751 183284 DEBUG nova.virt.libvirt.host [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 18:24:13 compute-0 nova_compute[183278]: 2026-01-21 18:24:13.752 183284 DEBUG nova.virt.libvirt.host [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 18:24:13 compute-0 nova_compute[183278]: 2026-01-21 18:24:13.753 183284 DEBUG nova.virt.libvirt.driver [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 18:24:13 compute-0 nova_compute[183278]: 2026-01-21 18:24:13.753 183284 DEBUG nova.virt.hardware [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T18:09:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='45095fe9-3fd5-4f1f-87b2-a2a8292135a2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T18:09:31Z,direct_url=<?>,disk_format='qcow2',id=672306ae-5521-4fc1-a825-a16d6d125c61,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='05bd8d3790e24414a90ecf55ee1939e1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T18:09:33Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 18:24:13 compute-0 nova_compute[183278]: 2026-01-21 18:24:13.753 183284 DEBUG nova.virt.hardware [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 18:24:13 compute-0 nova_compute[183278]: 2026-01-21 18:24:13.754 183284 DEBUG nova.virt.hardware [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 18:24:13 compute-0 nova_compute[183278]: 2026-01-21 18:24:13.754 183284 DEBUG nova.virt.hardware [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 18:24:13 compute-0 nova_compute[183278]: 2026-01-21 18:24:13.754 183284 DEBUG nova.virt.hardware [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 18:24:13 compute-0 nova_compute[183278]: 2026-01-21 18:24:13.754 183284 DEBUG nova.virt.hardware [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 18:24:13 compute-0 nova_compute[183278]: 2026-01-21 18:24:13.754 183284 DEBUG nova.virt.hardware [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 18:24:13 compute-0 nova_compute[183278]: 2026-01-21 18:24:13.755 183284 DEBUG nova.virt.hardware [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 18:24:13 compute-0 nova_compute[183278]: 2026-01-21 18:24:13.755 183284 DEBUG nova.virt.hardware [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 18:24:13 compute-0 nova_compute[183278]: 2026-01-21 18:24:13.755 183284 DEBUG nova.virt.hardware [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 18:24:13 compute-0 nova_compute[183278]: 2026-01-21 18:24:13.755 183284 DEBUG nova.virt.hardware [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 18:24:13 compute-0 nova_compute[183278]: 2026-01-21 18:24:13.758 183284 DEBUG nova.virt.libvirt.vif [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T18:24:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-148334864',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-148334864',id=12,image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fe688847145f4dee992c72dd40bbc1ac',ramdisk_id='',reservation_id='r-j008cwlr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1753607426',owner_user_name='tempest-TestExecuteStrategies-1753607426-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T18:24:09Z,user_data=None,user_id='41dc6e790bc54fbfaf5c6007d3fa5f63',uuid=846c40aa-a089-4213-89d3-b56681e73e18,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "260f28a8-7a1b-454a-830d-2f41597334af", "address": "fa:16:3e:f4:3c:81", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap260f28a8-7a", "ovs_interfaceid": "260f28a8-7a1b-454a-830d-2f41597334af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 18:24:13 compute-0 nova_compute[183278]: 2026-01-21 18:24:13.759 183284 DEBUG nova.network.os_vif_util [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Converting VIF {"id": "260f28a8-7a1b-454a-830d-2f41597334af", "address": "fa:16:3e:f4:3c:81", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap260f28a8-7a", "ovs_interfaceid": "260f28a8-7a1b-454a-830d-2f41597334af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 18:24:13 compute-0 nova_compute[183278]: 2026-01-21 18:24:13.759 183284 DEBUG nova.network.os_vif_util [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f4:3c:81,bridge_name='br-int',has_traffic_filtering=True,id=260f28a8-7a1b-454a-830d-2f41597334af,network=Network(405ec01b-76d3-4c3c-a31b-5f16d9641fbf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap260f28a8-7a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 18:24:13 compute-0 nova_compute[183278]: 2026-01-21 18:24:13.760 183284 DEBUG nova.objects.instance [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lazy-loading 'pci_devices' on Instance uuid 846c40aa-a089-4213-89d3-b56681e73e18 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:24:13 compute-0 nova_compute[183278]: 2026-01-21 18:24:13.778 183284 DEBUG nova.virt.libvirt.driver [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 846c40aa-a089-4213-89d3-b56681e73e18] End _get_guest_xml xml=<domain type="kvm">
Jan 21 18:24:13 compute-0 nova_compute[183278]:   <uuid>846c40aa-a089-4213-89d3-b56681e73e18</uuid>
Jan 21 18:24:13 compute-0 nova_compute[183278]:   <name>instance-0000000c</name>
Jan 21 18:24:13 compute-0 nova_compute[183278]:   <memory>131072</memory>
Jan 21 18:24:13 compute-0 nova_compute[183278]:   <vcpu>1</vcpu>
Jan 21 18:24:13 compute-0 nova_compute[183278]:   <metadata>
Jan 21 18:24:13 compute-0 nova_compute[183278]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 18:24:13 compute-0 nova_compute[183278]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 18:24:13 compute-0 nova_compute[183278]:       <nova:name>tempest-TestExecuteStrategies-server-148334864</nova:name>
Jan 21 18:24:13 compute-0 nova_compute[183278]:       <nova:creationTime>2026-01-21 18:24:13</nova:creationTime>
Jan 21 18:24:13 compute-0 nova_compute[183278]:       <nova:flavor name="m1.nano">
Jan 21 18:24:13 compute-0 nova_compute[183278]:         <nova:memory>128</nova:memory>
Jan 21 18:24:13 compute-0 nova_compute[183278]:         <nova:disk>1</nova:disk>
Jan 21 18:24:13 compute-0 nova_compute[183278]:         <nova:swap>0</nova:swap>
Jan 21 18:24:13 compute-0 nova_compute[183278]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 18:24:13 compute-0 nova_compute[183278]:         <nova:vcpus>1</nova:vcpus>
Jan 21 18:24:13 compute-0 nova_compute[183278]:       </nova:flavor>
Jan 21 18:24:13 compute-0 nova_compute[183278]:       <nova:owner>
Jan 21 18:24:13 compute-0 nova_compute[183278]:         <nova:user uuid="41dc6e790bc54fbfaf5c6007d3fa5f63">tempest-TestExecuteStrategies-1753607426-project-member</nova:user>
Jan 21 18:24:13 compute-0 nova_compute[183278]:         <nova:project uuid="fe688847145f4dee992c72dd40bbc1ac">tempest-TestExecuteStrategies-1753607426</nova:project>
Jan 21 18:24:13 compute-0 nova_compute[183278]:       </nova:owner>
Jan 21 18:24:13 compute-0 nova_compute[183278]:       <nova:root type="image" uuid="672306ae-5521-4fc1-a825-a16d6d125c61"/>
Jan 21 18:24:13 compute-0 nova_compute[183278]:       <nova:ports>
Jan 21 18:24:13 compute-0 nova_compute[183278]:         <nova:port uuid="260f28a8-7a1b-454a-830d-2f41597334af">
Jan 21 18:24:13 compute-0 nova_compute[183278]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 21 18:24:13 compute-0 nova_compute[183278]:         </nova:port>
Jan 21 18:24:13 compute-0 nova_compute[183278]:       </nova:ports>
Jan 21 18:24:13 compute-0 nova_compute[183278]:     </nova:instance>
Jan 21 18:24:13 compute-0 nova_compute[183278]:   </metadata>
Jan 21 18:24:13 compute-0 nova_compute[183278]:   <sysinfo type="smbios">
Jan 21 18:24:13 compute-0 nova_compute[183278]:     <system>
Jan 21 18:24:13 compute-0 nova_compute[183278]:       <entry name="manufacturer">RDO</entry>
Jan 21 18:24:13 compute-0 nova_compute[183278]:       <entry name="product">OpenStack Compute</entry>
Jan 21 18:24:13 compute-0 nova_compute[183278]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 18:24:13 compute-0 nova_compute[183278]:       <entry name="serial">846c40aa-a089-4213-89d3-b56681e73e18</entry>
Jan 21 18:24:13 compute-0 nova_compute[183278]:       <entry name="uuid">846c40aa-a089-4213-89d3-b56681e73e18</entry>
Jan 21 18:24:13 compute-0 nova_compute[183278]:       <entry name="family">Virtual Machine</entry>
Jan 21 18:24:13 compute-0 nova_compute[183278]:     </system>
Jan 21 18:24:13 compute-0 nova_compute[183278]:   </sysinfo>
Jan 21 18:24:13 compute-0 nova_compute[183278]:   <os>
Jan 21 18:24:13 compute-0 nova_compute[183278]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 18:24:13 compute-0 nova_compute[183278]:     <boot dev="hd"/>
Jan 21 18:24:13 compute-0 nova_compute[183278]:     <smbios mode="sysinfo"/>
Jan 21 18:24:13 compute-0 nova_compute[183278]:   </os>
Jan 21 18:24:13 compute-0 nova_compute[183278]:   <features>
Jan 21 18:24:13 compute-0 nova_compute[183278]:     <acpi/>
Jan 21 18:24:13 compute-0 nova_compute[183278]:     <apic/>
Jan 21 18:24:13 compute-0 nova_compute[183278]:     <vmcoreinfo/>
Jan 21 18:24:13 compute-0 nova_compute[183278]:   </features>
Jan 21 18:24:13 compute-0 nova_compute[183278]:   <clock offset="utc">
Jan 21 18:24:13 compute-0 nova_compute[183278]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 18:24:13 compute-0 nova_compute[183278]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 18:24:13 compute-0 nova_compute[183278]:     <timer name="hpet" present="no"/>
Jan 21 18:24:13 compute-0 nova_compute[183278]:   </clock>
Jan 21 18:24:13 compute-0 nova_compute[183278]:   <cpu mode="custom" match="exact">
Jan 21 18:24:13 compute-0 nova_compute[183278]:     <model>Nehalem</model>
Jan 21 18:24:13 compute-0 nova_compute[183278]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 18:24:13 compute-0 nova_compute[183278]:   </cpu>
Jan 21 18:24:13 compute-0 nova_compute[183278]:   <devices>
Jan 21 18:24:13 compute-0 nova_compute[183278]:     <disk type="file" device="disk">
Jan 21 18:24:13 compute-0 nova_compute[183278]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 18:24:13 compute-0 nova_compute[183278]:       <source file="/var/lib/nova/instances/846c40aa-a089-4213-89d3-b56681e73e18/disk"/>
Jan 21 18:24:13 compute-0 nova_compute[183278]:       <target dev="vda" bus="virtio"/>
Jan 21 18:24:13 compute-0 nova_compute[183278]:     </disk>
Jan 21 18:24:13 compute-0 nova_compute[183278]:     <disk type="file" device="cdrom">
Jan 21 18:24:13 compute-0 nova_compute[183278]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 18:24:13 compute-0 nova_compute[183278]:       <source file="/var/lib/nova/instances/846c40aa-a089-4213-89d3-b56681e73e18/disk.config"/>
Jan 21 18:24:13 compute-0 nova_compute[183278]:       <target dev="sda" bus="sata"/>
Jan 21 18:24:13 compute-0 nova_compute[183278]:     </disk>
Jan 21 18:24:13 compute-0 nova_compute[183278]:     <interface type="ethernet">
Jan 21 18:24:13 compute-0 nova_compute[183278]:       <mac address="fa:16:3e:f4:3c:81"/>
Jan 21 18:24:13 compute-0 nova_compute[183278]:       <model type="virtio"/>
Jan 21 18:24:13 compute-0 nova_compute[183278]:       <driver name="vhost" rx_queue_size="512"/>
Jan 21 18:24:13 compute-0 nova_compute[183278]:       <mtu size="1442"/>
Jan 21 18:24:13 compute-0 nova_compute[183278]:       <target dev="tap260f28a8-7a"/>
Jan 21 18:24:13 compute-0 nova_compute[183278]:     </interface>
Jan 21 18:24:13 compute-0 nova_compute[183278]:     <serial type="pty">
Jan 21 18:24:13 compute-0 nova_compute[183278]:       <log file="/var/lib/nova/instances/846c40aa-a089-4213-89d3-b56681e73e18/console.log" append="off"/>
Jan 21 18:24:13 compute-0 nova_compute[183278]:     </serial>
Jan 21 18:24:13 compute-0 nova_compute[183278]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 18:24:13 compute-0 nova_compute[183278]:     <video>
Jan 21 18:24:13 compute-0 nova_compute[183278]:       <model type="virtio"/>
Jan 21 18:24:13 compute-0 nova_compute[183278]:     </video>
Jan 21 18:24:13 compute-0 nova_compute[183278]:     <input type="tablet" bus="usb"/>
Jan 21 18:24:13 compute-0 nova_compute[183278]:     <rng model="virtio">
Jan 21 18:24:13 compute-0 nova_compute[183278]:       <backend model="random">/dev/urandom</backend>
Jan 21 18:24:13 compute-0 nova_compute[183278]:     </rng>
Jan 21 18:24:13 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root"/>
Jan 21 18:24:13 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:24:13 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:24:13 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:24:13 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:24:13 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:24:13 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:24:13 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:24:13 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:24:13 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:24:13 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:24:13 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:24:13 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:24:13 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:24:13 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:24:13 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:24:13 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:24:13 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:24:13 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:24:13 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:24:13 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:24:13 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:24:13 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:24:13 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:24:13 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:24:13 compute-0 nova_compute[183278]:     <controller type="usb" index="0"/>
Jan 21 18:24:13 compute-0 nova_compute[183278]:     <memballoon model="virtio">
Jan 21 18:24:13 compute-0 nova_compute[183278]:       <stats period="10"/>
Jan 21 18:24:13 compute-0 nova_compute[183278]:     </memballoon>
Jan 21 18:24:13 compute-0 nova_compute[183278]:   </devices>
Jan 21 18:24:13 compute-0 nova_compute[183278]: </domain>
Jan 21 18:24:13 compute-0 nova_compute[183278]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 18:24:13 compute-0 nova_compute[183278]: 2026-01-21 18:24:13.779 183284 DEBUG nova.compute.manager [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 846c40aa-a089-4213-89d3-b56681e73e18] Preparing to wait for external event network-vif-plugged-260f28a8-7a1b-454a-830d-2f41597334af prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 21 18:24:13 compute-0 nova_compute[183278]: 2026-01-21 18:24:13.780 183284 DEBUG oslo_concurrency.lockutils [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Acquiring lock "846c40aa-a089-4213-89d3-b56681e73e18-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:24:13 compute-0 nova_compute[183278]: 2026-01-21 18:24:13.780 183284 DEBUG oslo_concurrency.lockutils [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "846c40aa-a089-4213-89d3-b56681e73e18-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:24:13 compute-0 nova_compute[183278]: 2026-01-21 18:24:13.780 183284 DEBUG oslo_concurrency.lockutils [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "846c40aa-a089-4213-89d3-b56681e73e18-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:24:13 compute-0 nova_compute[183278]: 2026-01-21 18:24:13.781 183284 DEBUG nova.virt.libvirt.vif [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T18:24:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-148334864',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-148334864',id=12,image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fe688847145f4dee992c72dd40bbc1ac',ramdisk_id='',reservation_id='r-j008cwlr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1753607426',owner_user_name='tempest-TestExecuteStrategies-1753607426-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T18:24:09Z,user_data=None,user_id='41dc6e790bc54fbfaf5c6007d3fa5f63',uuid=846c40aa-a089-4213-89d3-b56681e73e18,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "260f28a8-7a1b-454a-830d-2f41597334af", "address": "fa:16:3e:f4:3c:81", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap260f28a8-7a", "ovs_interfaceid": "260f28a8-7a1b-454a-830d-2f41597334af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 21 18:24:13 compute-0 nova_compute[183278]: 2026-01-21 18:24:13.781 183284 DEBUG nova.network.os_vif_util [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Converting VIF {"id": "260f28a8-7a1b-454a-830d-2f41597334af", "address": "fa:16:3e:f4:3c:81", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap260f28a8-7a", "ovs_interfaceid": "260f28a8-7a1b-454a-830d-2f41597334af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 18:24:13 compute-0 nova_compute[183278]: 2026-01-21 18:24:13.782 183284 DEBUG nova.network.os_vif_util [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f4:3c:81,bridge_name='br-int',has_traffic_filtering=True,id=260f28a8-7a1b-454a-830d-2f41597334af,network=Network(405ec01b-76d3-4c3c-a31b-5f16d9641fbf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap260f28a8-7a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 18:24:13 compute-0 nova_compute[183278]: 2026-01-21 18:24:13.782 183284 DEBUG os_vif [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f4:3c:81,bridge_name='br-int',has_traffic_filtering=True,id=260f28a8-7a1b-454a-830d-2f41597334af,network=Network(405ec01b-76d3-4c3c-a31b-5f16d9641fbf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap260f28a8-7a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 21 18:24:13 compute-0 nova_compute[183278]: 2026-01-21 18:24:13.782 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:24:13 compute-0 nova_compute[183278]: 2026-01-21 18:24:13.782 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:24:13 compute-0 nova_compute[183278]: 2026-01-21 18:24:13.783 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 18:24:13 compute-0 nova_compute[183278]: 2026-01-21 18:24:13.786 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:24:13 compute-0 nova_compute[183278]: 2026-01-21 18:24:13.786 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap260f28a8-7a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:24:13 compute-0 nova_compute[183278]: 2026-01-21 18:24:13.787 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap260f28a8-7a, col_values=(('external_ids', {'iface-id': '260f28a8-7a1b-454a-830d-2f41597334af', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f4:3c:81', 'vm-uuid': '846c40aa-a089-4213-89d3-b56681e73e18'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:24:13 compute-0 NetworkManager[55506]: <info>  [1769019853.7894] manager: (tap260f28a8-7a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/43)
Jan 21 18:24:13 compute-0 nova_compute[183278]: 2026-01-21 18:24:13.788 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:24:13 compute-0 nova_compute[183278]: 2026-01-21 18:24:13.791 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 18:24:13 compute-0 nova_compute[183278]: 2026-01-21 18:24:13.796 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:24:13 compute-0 nova_compute[183278]: 2026-01-21 18:24:13.797 183284 INFO os_vif [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f4:3c:81,bridge_name='br-int',has_traffic_filtering=True,id=260f28a8-7a1b-454a-830d-2f41597334af,network=Network(405ec01b-76d3-4c3c-a31b-5f16d9641fbf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap260f28a8-7a')
Jan 21 18:24:13 compute-0 nova_compute[183278]: 2026-01-21 18:24:13.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:24:13 compute-0 nova_compute[183278]: 2026-01-21 18:24:13.816 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 18:24:13 compute-0 nova_compute[183278]: 2026-01-21 18:24:13.816 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 18:24:13 compute-0 nova_compute[183278]: 2026-01-21 18:24:13.832 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] [instance: 846c40aa-a089-4213-89d3-b56681e73e18] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Jan 21 18:24:13 compute-0 nova_compute[183278]: 2026-01-21 18:24:13.832 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 21 18:24:14 compute-0 nova_compute[183278]: 2026-01-21 18:24:14.192 183284 DEBUG nova.virt.libvirt.driver [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 18:24:14 compute-0 nova_compute[183278]: 2026-01-21 18:24:14.193 183284 DEBUG nova.virt.libvirt.driver [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 18:24:14 compute-0 nova_compute[183278]: 2026-01-21 18:24:14.193 183284 DEBUG nova.virt.libvirt.driver [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] No VIF found with MAC fa:16:3e:f4:3c:81, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 21 18:24:14 compute-0 nova_compute[183278]: 2026-01-21 18:24:14.194 183284 INFO nova.virt.libvirt.driver [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 846c40aa-a089-4213-89d3-b56681e73e18] Using config drive
Jan 21 18:24:14 compute-0 nova_compute[183278]: 2026-01-21 18:24:14.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:24:14 compute-0 nova_compute[183278]: 2026-01-21 18:24:14.848 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:24:14 compute-0 nova_compute[183278]: 2026-01-21 18:24:14.848 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:24:14 compute-0 nova_compute[183278]: 2026-01-21 18:24:14.849 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:24:14 compute-0 nova_compute[183278]: 2026-01-21 18:24:14.849 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 18:24:14 compute-0 nova_compute[183278]: 2026-01-21 18:24:14.910 183284 DEBUG oslo_concurrency.processutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/846c40aa-a089-4213-89d3-b56681e73e18/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:24:14 compute-0 nova_compute[183278]: 2026-01-21 18:24:14.970 183284 DEBUG oslo_concurrency.processutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/846c40aa-a089-4213-89d3-b56681e73e18/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:24:14 compute-0 nova_compute[183278]: 2026-01-21 18:24:14.972 183284 DEBUG oslo_concurrency.processutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/846c40aa-a089-4213-89d3-b56681e73e18/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:24:15 compute-0 nova_compute[183278]: 2026-01-21 18:24:15.030 183284 DEBUG oslo_concurrency.processutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/846c40aa-a089-4213-89d3-b56681e73e18/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:24:15 compute-0 nova_compute[183278]: 2026-01-21 18:24:15.032 183284 WARNING nova.virt.libvirt.driver [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Periodic task is updating the host stat, it is trying to get disk instance-0000000c, but disk file was removed by concurrent operations such as resize.: FileNotFoundError: [Errno 2] No such file or directory: '/var/lib/nova/instances/846c40aa-a089-4213-89d3-b56681e73e18/disk.config'
Jan 21 18:24:15 compute-0 nova_compute[183278]: 2026-01-21 18:24:15.140 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:24:15 compute-0 nova_compute[183278]: 2026-01-21 18:24:15.191 183284 WARNING nova.virt.libvirt.driver [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 18:24:15 compute-0 nova_compute[183278]: 2026-01-21 18:24:15.192 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5879MB free_disk=73.38133239746094GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 18:24:15 compute-0 nova_compute[183278]: 2026-01-21 18:24:15.192 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:24:15 compute-0 nova_compute[183278]: 2026-01-21 18:24:15.192 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:24:15 compute-0 nova_compute[183278]: 2026-01-21 18:24:15.312 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Instance 846c40aa-a089-4213-89d3-b56681e73e18 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 21 18:24:15 compute-0 nova_compute[183278]: 2026-01-21 18:24:15.312 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 18:24:15 compute-0 nova_compute[183278]: 2026-01-21 18:24:15.312 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 18:24:15 compute-0 nova_compute[183278]: 2026-01-21 18:24:15.352 183284 DEBUG nova.compute.provider_tree [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Inventory has not changed in ProviderTree for provider: 502e4243-611b-433d-a766-9b485d51652d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 18:24:15 compute-0 nova_compute[183278]: 2026-01-21 18:24:15.366 183284 DEBUG nova.scheduler.client.report [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Inventory has not changed for provider 502e4243-611b-433d-a766-9b485d51652d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 18:24:15 compute-0 nova_compute[183278]: 2026-01-21 18:24:15.383 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 18:24:15 compute-0 nova_compute[183278]: 2026-01-21 18:24:15.383 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.191s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:24:15 compute-0 nova_compute[183278]: 2026-01-21 18:24:15.780 183284 INFO nova.virt.libvirt.driver [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 846c40aa-a089-4213-89d3-b56681e73e18] Creating config drive at /var/lib/nova/instances/846c40aa-a089-4213-89d3-b56681e73e18/disk.config
Jan 21 18:24:15 compute-0 nova_compute[183278]: 2026-01-21 18:24:15.791 183284 DEBUG oslo_concurrency.processutils [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/846c40aa-a089-4213-89d3-b56681e73e18/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpl7nib4ru execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:24:15 compute-0 nova_compute[183278]: 2026-01-21 18:24:15.917 183284 DEBUG oslo_concurrency.processutils [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/846c40aa-a089-4213-89d3-b56681e73e18/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpl7nib4ru" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:24:15 compute-0 kernel: tap260f28a8-7a: entered promiscuous mode
Jan 21 18:24:15 compute-0 NetworkManager[55506]: <info>  [1769019855.9691] manager: (tap260f28a8-7a): new Tun device (/org/freedesktop/NetworkManager/Devices/44)
Jan 21 18:24:15 compute-0 ovn_controller[95419]: 2026-01-21T18:24:15Z|00089|binding|INFO|Claiming lport 260f28a8-7a1b-454a-830d-2f41597334af for this chassis.
Jan 21 18:24:15 compute-0 ovn_controller[95419]: 2026-01-21T18:24:15Z|00090|binding|INFO|260f28a8-7a1b-454a-830d-2f41597334af: Claiming fa:16:3e:f4:3c:81 10.100.0.7
Jan 21 18:24:15 compute-0 nova_compute[183278]: 2026-01-21 18:24:15.971 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:24:15 compute-0 nova_compute[183278]: 2026-01-21 18:24:15.973 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:24:15 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:24:15.983 104698 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f4:3c:81 10.100.0.7'], port_security=['fa:16:3e:f4:3c:81 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '846c40aa-a089-4213-89d3-b56681e73e18', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-405ec01b-76d3-4c3c-a31b-5f16d9641fbf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe688847145f4dee992c72dd40bbc1ac', 'neutron:revision_number': '2', 'neutron:security_group_ids': '772bc98e-4f63-476c-ac15-1e185ee339f7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=64b67886-c0d9-40d2-a2d0-cf96a9cd3c14, chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>], logical_port=260f28a8-7a1b-454a-830d-2f41597334af) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 18:24:15 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:24:15.984 104698 INFO neutron.agent.ovn.metadata.agent [-] Port 260f28a8-7a1b-454a-830d-2f41597334af in datapath 405ec01b-76d3-4c3c-a31b-5f16d9641fbf bound to our chassis
Jan 21 18:24:15 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:24:15.985 104698 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 405ec01b-76d3-4c3c-a31b-5f16d9641fbf
Jan 21 18:24:15 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:24:15.994 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[52a3a075-6c91-4fbc-9088-f7b93fe5dcaa]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:24:15 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:24:15.996 104698 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap405ec01b-71 in ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 21 18:24:15 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:24:15.998 203892 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap405ec01b-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 21 18:24:15 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:24:15.998 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[ed912a56-8c8f-48bb-a559-1a2a120f185c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:24:16 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:24:15.999 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[012f3541-93bc-4431-b2f5-a89c250eaf76]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:24:16 compute-0 systemd-udevd[207400]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 18:24:16 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:24:16.009 105036 DEBUG oslo.privsep.daemon [-] privsep: reply[95ae5ec5-c697-4a43-b245-5cacf4dc90b0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:24:16 compute-0 NetworkManager[55506]: <info>  [1769019856.0190] device (tap260f28a8-7a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 18:24:16 compute-0 NetworkManager[55506]: <info>  [1769019856.0197] device (tap260f28a8-7a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 18:24:16 compute-0 systemd-machined[154592]: New machine qemu-8-instance-0000000c.
Jan 21 18:24:16 compute-0 nova_compute[183278]: 2026-01-21 18:24:16.028 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:24:16 compute-0 systemd[1]: Started Virtual Machine qemu-8-instance-0000000c.
Jan 21 18:24:16 compute-0 ovn_controller[95419]: 2026-01-21T18:24:16Z|00091|binding|INFO|Setting lport 260f28a8-7a1b-454a-830d-2f41597334af ovn-installed in OVS
Jan 21 18:24:16 compute-0 ovn_controller[95419]: 2026-01-21T18:24:16Z|00092|binding|INFO|Setting lport 260f28a8-7a1b-454a-830d-2f41597334af up in Southbound
Jan 21 18:24:16 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:24:16.036 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[a07e75c9-be3b-4d95-9f8c-2a3737d57be6]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:24:16 compute-0 nova_compute[183278]: 2026-01-21 18:24:16.038 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:24:16 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:24:16.062 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[f4f59df2-bd25-420d-81c9-9275589eedcc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:24:16 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:24:16.068 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[2968dfc5-b098-41ee-80b5-e0e5769800ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:24:16 compute-0 NetworkManager[55506]: <info>  [1769019856.0696] manager: (tap405ec01b-70): new Veth device (/org/freedesktop/NetworkManager/Devices/45)
Jan 21 18:24:16 compute-0 systemd-udevd[207405]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 18:24:16 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:24:16.100 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[4b599c52-2aef-41f7-b478-4af96d98c538]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:24:16 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:24:16.104 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[431d6e50-8e42-496d-8091-f434fbcc1303]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:24:16 compute-0 NetworkManager[55506]: <info>  [1769019856.1266] device (tap405ec01b-70): carrier: link connected
Jan 21 18:24:16 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:24:16.133 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[751c5fec-57a6-4225-8953-ca56d0b96c1a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:24:16 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:24:16.148 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[05d188b8-743f-4a5b-ac7d-fb6943f8316c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap405ec01b-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6c:95:02'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 441973, 'reachable_time': 19131, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 207433, 'error': None, 'target': 'ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:24:16 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:24:16.160 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[c0d7b11c-289e-4f59-b301-4af03d58ef33]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe6c:9502'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 441973, 'tstamp': 441973}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 207434, 'error': None, 'target': 'ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:24:16 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:24:16.175 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[fa56f5e6-fdc8-4a98-9c6b-3453582ed7ed]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap405ec01b-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6c:95:02'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 441973, 'reachable_time': 19131, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 207435, 'error': None, 'target': 'ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:24:16 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:24:16.201 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[58b3a3fe-621e-472b-890a-340cb706fd9d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:24:16 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:24:16.254 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[b9dc6fcd-b2da-434f-a120-530877cd5a6f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:24:16 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:24:16.255 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap405ec01b-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:24:16 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:24:16.255 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 18:24:16 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:24:16.256 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap405ec01b-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:24:16 compute-0 nova_compute[183278]: 2026-01-21 18:24:16.257 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:24:16 compute-0 NetworkManager[55506]: <info>  [1769019856.2580] manager: (tap405ec01b-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/46)
Jan 21 18:24:16 compute-0 kernel: tap405ec01b-70: entered promiscuous mode
Jan 21 18:24:16 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:24:16.260 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap405ec01b-70, col_values=(('external_ids', {'iface-id': '9c897ad2-8ce5-4903-8c83-1ed8f117dcdd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:24:16 compute-0 nova_compute[183278]: 2026-01-21 18:24:16.261 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:24:16 compute-0 ovn_controller[95419]: 2026-01-21T18:24:16Z|00093|binding|INFO|Releasing lport 9c897ad2-8ce5-4903-8c83-1ed8f117dcdd from this chassis (sb_readonly=0)
Jan 21 18:24:16 compute-0 nova_compute[183278]: 2026-01-21 18:24:16.271 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:24:16 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:24:16.272 104698 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/405ec01b-76d3-4c3c-a31b-5f16d9641fbf.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/405ec01b-76d3-4c3c-a31b-5f16d9641fbf.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 21 18:24:16 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:24:16.273 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[e9a41254-6308-4b7a-b5d2-4454c100dcf2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:24:16 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:24:16.273 104698 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 18:24:16 compute-0 ovn_metadata_agent[104693]: global
Jan 21 18:24:16 compute-0 ovn_metadata_agent[104693]:     log         /dev/log local0 debug
Jan 21 18:24:16 compute-0 ovn_metadata_agent[104693]:     log-tag     haproxy-metadata-proxy-405ec01b-76d3-4c3c-a31b-5f16d9641fbf
Jan 21 18:24:16 compute-0 ovn_metadata_agent[104693]:     user        root
Jan 21 18:24:16 compute-0 ovn_metadata_agent[104693]:     group       root
Jan 21 18:24:16 compute-0 ovn_metadata_agent[104693]:     maxconn     1024
Jan 21 18:24:16 compute-0 ovn_metadata_agent[104693]:     pidfile     /var/lib/neutron/external/pids/405ec01b-76d3-4c3c-a31b-5f16d9641fbf.pid.haproxy
Jan 21 18:24:16 compute-0 ovn_metadata_agent[104693]:     daemon
Jan 21 18:24:16 compute-0 ovn_metadata_agent[104693]: 
Jan 21 18:24:16 compute-0 ovn_metadata_agent[104693]: defaults
Jan 21 18:24:16 compute-0 ovn_metadata_agent[104693]:     log global
Jan 21 18:24:16 compute-0 ovn_metadata_agent[104693]:     mode http
Jan 21 18:24:16 compute-0 ovn_metadata_agent[104693]:     option httplog
Jan 21 18:24:16 compute-0 ovn_metadata_agent[104693]:     option dontlognull
Jan 21 18:24:16 compute-0 ovn_metadata_agent[104693]:     option http-server-close
Jan 21 18:24:16 compute-0 ovn_metadata_agent[104693]:     option forwardfor
Jan 21 18:24:16 compute-0 ovn_metadata_agent[104693]:     retries                 3
Jan 21 18:24:16 compute-0 ovn_metadata_agent[104693]:     timeout http-request    30s
Jan 21 18:24:16 compute-0 ovn_metadata_agent[104693]:     timeout connect         30s
Jan 21 18:24:16 compute-0 ovn_metadata_agent[104693]:     timeout client          32s
Jan 21 18:24:16 compute-0 ovn_metadata_agent[104693]:     timeout server          32s
Jan 21 18:24:16 compute-0 ovn_metadata_agent[104693]:     timeout http-keep-alive 30s
Jan 21 18:24:16 compute-0 ovn_metadata_agent[104693]: 
Jan 21 18:24:16 compute-0 ovn_metadata_agent[104693]: 
Jan 21 18:24:16 compute-0 ovn_metadata_agent[104693]: listen listener
Jan 21 18:24:16 compute-0 ovn_metadata_agent[104693]:     bind 169.254.169.254:80
Jan 21 18:24:16 compute-0 ovn_metadata_agent[104693]:     server metadata /var/lib/neutron/metadata_proxy
Jan 21 18:24:16 compute-0 ovn_metadata_agent[104693]:     http-request add-header X-OVN-Network-ID 405ec01b-76d3-4c3c-a31b-5f16d9641fbf
Jan 21 18:24:16 compute-0 ovn_metadata_agent[104693]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 21 18:24:16 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:24:16.274 104698 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf', 'env', 'PROCESS_TAG=haproxy-405ec01b-76d3-4c3c-a31b-5f16d9641fbf', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/405ec01b-76d3-4c3c-a31b-5f16d9641fbf.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 21 18:24:16 compute-0 nova_compute[183278]: 2026-01-21 18:24:16.338 183284 DEBUG nova.virt.driver [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Emitting event <LifecycleEvent: 1769019856.337882, 846c40aa-a089-4213-89d3-b56681e73e18 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:24:16 compute-0 nova_compute[183278]: 2026-01-21 18:24:16.343 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 846c40aa-a089-4213-89d3-b56681e73e18] VM Started (Lifecycle Event)
Jan 21 18:24:16 compute-0 nova_compute[183278]: 2026-01-21 18:24:16.360 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 846c40aa-a089-4213-89d3-b56681e73e18] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:24:16 compute-0 nova_compute[183278]: 2026-01-21 18:24:16.364 183284 DEBUG nova.virt.driver [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Emitting event <LifecycleEvent: 1769019856.3380535, 846c40aa-a089-4213-89d3-b56681e73e18 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:24:16 compute-0 nova_compute[183278]: 2026-01-21 18:24:16.365 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 846c40aa-a089-4213-89d3-b56681e73e18] VM Paused (Lifecycle Event)
Jan 21 18:24:16 compute-0 nova_compute[183278]: 2026-01-21 18:24:16.382 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 846c40aa-a089-4213-89d3-b56681e73e18] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:24:16 compute-0 nova_compute[183278]: 2026-01-21 18:24:16.385 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 846c40aa-a089-4213-89d3-b56681e73e18] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 18:24:16 compute-0 nova_compute[183278]: 2026-01-21 18:24:16.402 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 846c40aa-a089-4213-89d3-b56681e73e18] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 18:24:16 compute-0 podman[207474]: 2026-01-21 18:24:16.62021197 +0000 UTC m=+0.049777348 container create 0f939214e535ec6a74bcf45ab9708c0b132664b4f126e8a2b5f9ecf05c34ebd9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 21 18:24:16 compute-0 systemd[1]: Started libpod-conmon-0f939214e535ec6a74bcf45ab9708c0b132664b4f126e8a2b5f9ecf05c34ebd9.scope.
Jan 21 18:24:16 compute-0 systemd[1]: Started libcrun container.
Jan 21 18:24:16 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/596f5bf6e4fa2adfd994c03dbfd407c69aec9e5b0c491d64cbba3a1338675973/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 18:24:16 compute-0 podman[207474]: 2026-01-21 18:24:16.592766834 +0000 UTC m=+0.022332242 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 18:24:16 compute-0 podman[207474]: 2026-01-21 18:24:16.698319132 +0000 UTC m=+0.127884530 container init 0f939214e535ec6a74bcf45ab9708c0b132664b4f126e8a2b5f9ecf05c34ebd9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 21 18:24:16 compute-0 podman[207474]: 2026-01-21 18:24:16.703093337 +0000 UTC m=+0.132658715 container start 0f939214e535ec6a74bcf45ab9708c0b132664b4f126e8a2b5f9ecf05c34ebd9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 21 18:24:16 compute-0 neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf[207490]: [NOTICE]   (207494) : New worker (207496) forked
Jan 21 18:24:16 compute-0 neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf[207490]: [NOTICE]   (207494) : Loading success.
Jan 21 18:24:16 compute-0 nova_compute[183278]: 2026-01-21 18:24:16.855 183284 DEBUG nova.compute.manager [req-767a5cab-3e97-47d5-b480-0c275af320fb req-f4a6fa0c-534e-4ac7-8c01-a8e52ac413c7 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 846c40aa-a089-4213-89d3-b56681e73e18] Received event network-vif-plugged-260f28a8-7a1b-454a-830d-2f41597334af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:24:16 compute-0 nova_compute[183278]: 2026-01-21 18:24:16.855 183284 DEBUG oslo_concurrency.lockutils [req-767a5cab-3e97-47d5-b480-0c275af320fb req-f4a6fa0c-534e-4ac7-8c01-a8e52ac413c7 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "846c40aa-a089-4213-89d3-b56681e73e18-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:24:16 compute-0 nova_compute[183278]: 2026-01-21 18:24:16.856 183284 DEBUG oslo_concurrency.lockutils [req-767a5cab-3e97-47d5-b480-0c275af320fb req-f4a6fa0c-534e-4ac7-8c01-a8e52ac413c7 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "846c40aa-a089-4213-89d3-b56681e73e18-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:24:16 compute-0 nova_compute[183278]: 2026-01-21 18:24:16.856 183284 DEBUG oslo_concurrency.lockutils [req-767a5cab-3e97-47d5-b480-0c275af320fb req-f4a6fa0c-534e-4ac7-8c01-a8e52ac413c7 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "846c40aa-a089-4213-89d3-b56681e73e18-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:24:16 compute-0 nova_compute[183278]: 2026-01-21 18:24:16.856 183284 DEBUG nova.compute.manager [req-767a5cab-3e97-47d5-b480-0c275af320fb req-f4a6fa0c-534e-4ac7-8c01-a8e52ac413c7 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 846c40aa-a089-4213-89d3-b56681e73e18] Processing event network-vif-plugged-260f28a8-7a1b-454a-830d-2f41597334af _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 21 18:24:16 compute-0 nova_compute[183278]: 2026-01-21 18:24:16.857 183284 DEBUG nova.compute.manager [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 846c40aa-a089-4213-89d3-b56681e73e18] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 18:24:16 compute-0 nova_compute[183278]: 2026-01-21 18:24:16.861 183284 DEBUG nova.virt.driver [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Emitting event <LifecycleEvent: 1769019856.8615742, 846c40aa-a089-4213-89d3-b56681e73e18 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:24:16 compute-0 nova_compute[183278]: 2026-01-21 18:24:16.862 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 846c40aa-a089-4213-89d3-b56681e73e18] VM Resumed (Lifecycle Event)
Jan 21 18:24:16 compute-0 nova_compute[183278]: 2026-01-21 18:24:16.864 183284 DEBUG nova.virt.libvirt.driver [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 846c40aa-a089-4213-89d3-b56681e73e18] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 18:24:16 compute-0 nova_compute[183278]: 2026-01-21 18:24:16.868 183284 INFO nova.virt.libvirt.driver [-] [instance: 846c40aa-a089-4213-89d3-b56681e73e18] Instance spawned successfully.
Jan 21 18:24:16 compute-0 nova_compute[183278]: 2026-01-21 18:24:16.868 183284 DEBUG nova.virt.libvirt.driver [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 846c40aa-a089-4213-89d3-b56681e73e18] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 21 18:24:16 compute-0 nova_compute[183278]: 2026-01-21 18:24:16.885 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 846c40aa-a089-4213-89d3-b56681e73e18] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:24:16 compute-0 nova_compute[183278]: 2026-01-21 18:24:16.891 183284 DEBUG nova.virt.libvirt.driver [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 846c40aa-a089-4213-89d3-b56681e73e18] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:24:16 compute-0 nova_compute[183278]: 2026-01-21 18:24:16.891 183284 DEBUG nova.virt.libvirt.driver [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 846c40aa-a089-4213-89d3-b56681e73e18] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:24:16 compute-0 nova_compute[183278]: 2026-01-21 18:24:16.892 183284 DEBUG nova.virt.libvirt.driver [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 846c40aa-a089-4213-89d3-b56681e73e18] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:24:16 compute-0 nova_compute[183278]: 2026-01-21 18:24:16.893 183284 DEBUG nova.virt.libvirt.driver [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 846c40aa-a089-4213-89d3-b56681e73e18] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:24:16 compute-0 nova_compute[183278]: 2026-01-21 18:24:16.893 183284 DEBUG nova.virt.libvirt.driver [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 846c40aa-a089-4213-89d3-b56681e73e18] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:24:16 compute-0 nova_compute[183278]: 2026-01-21 18:24:16.894 183284 DEBUG nova.virt.libvirt.driver [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 846c40aa-a089-4213-89d3-b56681e73e18] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:24:16 compute-0 nova_compute[183278]: 2026-01-21 18:24:16.903 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 846c40aa-a089-4213-89d3-b56681e73e18] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 18:24:16 compute-0 nova_compute[183278]: 2026-01-21 18:24:16.932 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 846c40aa-a089-4213-89d3-b56681e73e18] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 18:24:16 compute-0 nova_compute[183278]: 2026-01-21 18:24:16.967 183284 INFO nova.compute.manager [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 846c40aa-a089-4213-89d3-b56681e73e18] Took 7.03 seconds to spawn the instance on the hypervisor.
Jan 21 18:24:16 compute-0 nova_compute[183278]: 2026-01-21 18:24:16.968 183284 DEBUG nova.compute.manager [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 846c40aa-a089-4213-89d3-b56681e73e18] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:24:17 compute-0 nova_compute[183278]: 2026-01-21 18:24:17.027 183284 INFO nova.compute.manager [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 846c40aa-a089-4213-89d3-b56681e73e18] Took 7.53 seconds to build instance.
Jan 21 18:24:17 compute-0 nova_compute[183278]: 2026-01-21 18:24:17.045 183284 DEBUG oslo_concurrency.lockutils [None req-3083720e-62b0-49dc-80d6-a2d784750cd8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "846c40aa-a089-4213-89d3-b56681e73e18" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.858s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:24:17 compute-0 nova_compute[183278]: 2026-01-21 18:24:17.718 183284 DEBUG nova.network.neutron [req-67b01a33-4855-4b9b-8a8e-dba676635b04 req-400327d9-1ae2-4c23-9dac-57db193468ec 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 846c40aa-a089-4213-89d3-b56681e73e18] Updated VIF entry in instance network info cache for port 260f28a8-7a1b-454a-830d-2f41597334af. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 18:24:17 compute-0 nova_compute[183278]: 2026-01-21 18:24:17.719 183284 DEBUG nova.network.neutron [req-67b01a33-4855-4b9b-8a8e-dba676635b04 req-400327d9-1ae2-4c23-9dac-57db193468ec 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 846c40aa-a089-4213-89d3-b56681e73e18] Updating instance_info_cache with network_info: [{"id": "260f28a8-7a1b-454a-830d-2f41597334af", "address": "fa:16:3e:f4:3c:81", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap260f28a8-7a", "ovs_interfaceid": "260f28a8-7a1b-454a-830d-2f41597334af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:24:17 compute-0 nova_compute[183278]: 2026-01-21 18:24:17.940 183284 DEBUG oslo_concurrency.lockutils [req-67b01a33-4855-4b9b-8a8e-dba676635b04 req-400327d9-1ae2-4c23-9dac-57db193468ec 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Releasing lock "refresh_cache-846c40aa-a089-4213-89d3-b56681e73e18" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 18:24:18 compute-0 nova_compute[183278]: 2026-01-21 18:24:18.383 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:24:18 compute-0 nova_compute[183278]: 2026-01-21 18:24:18.384 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:24:18 compute-0 nova_compute[183278]: 2026-01-21 18:24:18.789 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:24:18 compute-0 nova_compute[183278]: 2026-01-21 18:24:18.812 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:24:18 compute-0 nova_compute[183278]: 2026-01-21 18:24:18.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:24:18 compute-0 nova_compute[183278]: 2026-01-21 18:24:18.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:24:18 compute-0 nova_compute[183278]: 2026-01-21 18:24:18.816 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 18:24:18 compute-0 nova_compute[183278]: 2026-01-21 18:24:18.924 183284 DEBUG nova.compute.manager [req-5def4d17-39aa-443f-8cc2-df5ece948468 req-d8eebc0b-6f49-447a-ad98-927da10d9ac2 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 846c40aa-a089-4213-89d3-b56681e73e18] Received event network-vif-plugged-260f28a8-7a1b-454a-830d-2f41597334af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:24:18 compute-0 nova_compute[183278]: 2026-01-21 18:24:18.924 183284 DEBUG oslo_concurrency.lockutils [req-5def4d17-39aa-443f-8cc2-df5ece948468 req-d8eebc0b-6f49-447a-ad98-927da10d9ac2 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "846c40aa-a089-4213-89d3-b56681e73e18-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:24:18 compute-0 nova_compute[183278]: 2026-01-21 18:24:18.925 183284 DEBUG oslo_concurrency.lockutils [req-5def4d17-39aa-443f-8cc2-df5ece948468 req-d8eebc0b-6f49-447a-ad98-927da10d9ac2 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "846c40aa-a089-4213-89d3-b56681e73e18-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:24:18 compute-0 nova_compute[183278]: 2026-01-21 18:24:18.925 183284 DEBUG oslo_concurrency.lockutils [req-5def4d17-39aa-443f-8cc2-df5ece948468 req-d8eebc0b-6f49-447a-ad98-927da10d9ac2 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "846c40aa-a089-4213-89d3-b56681e73e18-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:24:18 compute-0 nova_compute[183278]: 2026-01-21 18:24:18.926 183284 DEBUG nova.compute.manager [req-5def4d17-39aa-443f-8cc2-df5ece948468 req-d8eebc0b-6f49-447a-ad98-927da10d9ac2 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 846c40aa-a089-4213-89d3-b56681e73e18] No waiting events found dispatching network-vif-plugged-260f28a8-7a1b-454a-830d-2f41597334af pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:24:18 compute-0 nova_compute[183278]: 2026-01-21 18:24:18.926 183284 WARNING nova.compute.manager [req-5def4d17-39aa-443f-8cc2-df5ece948468 req-d8eebc0b-6f49-447a-ad98-927da10d9ac2 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 846c40aa-a089-4213-89d3-b56681e73e18] Received unexpected event network-vif-plugged-260f28a8-7a1b-454a-830d-2f41597334af for instance with vm_state active and task_state None.
Jan 21 18:24:19 compute-0 nova_compute[183278]: 2026-01-21 18:24:19.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:24:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:24:20.080 104698 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:24:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:24:20.081 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:24:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:24:20.082 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:24:20 compute-0 nova_compute[183278]: 2026-01-21 18:24:20.162 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:24:21 compute-0 nova_compute[183278]: 2026-01-21 18:24:21.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:24:22 compute-0 podman[207505]: 2026-01-21 18:24:22.002921637 +0000 UTC m=+0.056636752 container health_status 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., version=9.6, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc.)
Jan 21 18:24:23 compute-0 nova_compute[183278]: 2026-01-21 18:24:23.792 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:24:25 compute-0 nova_compute[183278]: 2026-01-21 18:24:25.164 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:24:28 compute-0 nova_compute[183278]: 2026-01-21 18:24:28.794 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:24:28 compute-0 podman[207542]: 2026-01-21 18:24:28.999455564 +0000 UTC m=+0.049663124 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 21 18:24:29 compute-0 podman[207541]: 2026-01-21 18:24:29.022163195 +0000 UTC m=+0.076572437 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 21 18:24:29 compute-0 ovn_controller[95419]: 2026-01-21T18:24:29Z|00016|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:f4:3c:81 10.100.0.7
Jan 21 18:24:29 compute-0 ovn_controller[95419]: 2026-01-21T18:24:29Z|00017|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f4:3c:81 10.100.0.7
Jan 21 18:24:29 compute-0 podman[192560]: time="2026-01-21T18:24:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 18:24:29 compute-0 podman[192560]: @ - - [21/Jan/2026:18:24:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16587 "" "Go-http-client/1.1"
Jan 21 18:24:29 compute-0 podman[192560]: @ - - [21/Jan/2026:18:24:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2629 "" "Go-http-client/1.1"
Jan 21 18:24:30 compute-0 nova_compute[183278]: 2026-01-21 18:24:30.166 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:24:31 compute-0 openstack_network_exporter[195402]: ERROR   18:24:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 21 18:24:31 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:24:31 compute-0 openstack_network_exporter[195402]: ERROR   18:24:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 21 18:24:31 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:24:33 compute-0 nova_compute[183278]: 2026-01-21 18:24:33.269 183284 DEBUG nova.virt.libvirt.driver [None req-d766fb60-47b0-4d87-980f-8768bc564e3f 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 834c4219-2ee2-47cb-aa62-2ed8545ea4e6] Creating tmpfile /var/lib/nova/instances/tmp56pd7u8c to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Jan 21 18:24:33 compute-0 nova_compute[183278]: 2026-01-21 18:24:33.269 183284 DEBUG nova.compute.manager [None req-d766fb60-47b0-4d87-980f-8768bc564e3f 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp56pd7u8c',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Jan 21 18:24:33 compute-0 nova_compute[183278]: 2026-01-21 18:24:33.795 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:24:33 compute-0 podman[207585]: 2026-01-21 18:24:33.984406886 +0000 UTC m=+0.044520720 container health_status 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 21 18:24:34 compute-0 nova_compute[183278]: 2026-01-21 18:24:34.139 183284 DEBUG nova.compute.manager [None req-d766fb60-47b0-4d87-980f-8768bc564e3f 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp56pd7u8c',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='834c4219-2ee2-47cb-aa62-2ed8545ea4e6',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Jan 21 18:24:34 compute-0 nova_compute[183278]: 2026-01-21 18:24:34.162 183284 DEBUG oslo_concurrency.lockutils [None req-d766fb60-47b0-4d87-980f-8768bc564e3f 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "refresh_cache-834c4219-2ee2-47cb-aa62-2ed8545ea4e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:24:34 compute-0 nova_compute[183278]: 2026-01-21 18:24:34.163 183284 DEBUG oslo_concurrency.lockutils [None req-d766fb60-47b0-4d87-980f-8768bc564e3f 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Acquired lock "refresh_cache-834c4219-2ee2-47cb-aa62-2ed8545ea4e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 18:24:34 compute-0 nova_compute[183278]: 2026-01-21 18:24:34.163 183284 DEBUG nova.network.neutron [None req-d766fb60-47b0-4d87-980f-8768bc564e3f 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 834c4219-2ee2-47cb-aa62-2ed8545ea4e6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 18:24:35 compute-0 nova_compute[183278]: 2026-01-21 18:24:35.168 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:24:35 compute-0 nova_compute[183278]: 2026-01-21 18:24:35.741 183284 DEBUG nova.network.neutron [None req-d766fb60-47b0-4d87-980f-8768bc564e3f 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 834c4219-2ee2-47cb-aa62-2ed8545ea4e6] Updating instance_info_cache with network_info: [{"id": "1dad2357-2038-47d2-9787-f54ea149b6c6", "address": "fa:16:3e:57:c0:6f", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1dad2357-20", "ovs_interfaceid": "1dad2357-2038-47d2-9787-f54ea149b6c6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:24:35 compute-0 nova_compute[183278]: 2026-01-21 18:24:35.765 183284 DEBUG oslo_concurrency.lockutils [None req-d766fb60-47b0-4d87-980f-8768bc564e3f 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Releasing lock "refresh_cache-834c4219-2ee2-47cb-aa62-2ed8545ea4e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 18:24:35 compute-0 nova_compute[183278]: 2026-01-21 18:24:35.766 183284 DEBUG nova.virt.libvirt.driver [None req-d766fb60-47b0-4d87-980f-8768bc564e3f 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 834c4219-2ee2-47cb-aa62-2ed8545ea4e6] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp56pd7u8c',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='834c4219-2ee2-47cb-aa62-2ed8545ea4e6',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Jan 21 18:24:35 compute-0 nova_compute[183278]: 2026-01-21 18:24:35.767 183284 DEBUG nova.virt.libvirt.driver [None req-d766fb60-47b0-4d87-980f-8768bc564e3f 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 834c4219-2ee2-47cb-aa62-2ed8545ea4e6] Creating instance directory: /var/lib/nova/instances/834c4219-2ee2-47cb-aa62-2ed8545ea4e6 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Jan 21 18:24:35 compute-0 nova_compute[183278]: 2026-01-21 18:24:35.767 183284 DEBUG nova.virt.libvirt.driver [None req-d766fb60-47b0-4d87-980f-8768bc564e3f 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 834c4219-2ee2-47cb-aa62-2ed8545ea4e6] Creating disk.info with the contents: {'/var/lib/nova/instances/834c4219-2ee2-47cb-aa62-2ed8545ea4e6/disk': 'qcow2', '/var/lib/nova/instances/834c4219-2ee2-47cb-aa62-2ed8545ea4e6/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854
Jan 21 18:24:35 compute-0 nova_compute[183278]: 2026-01-21 18:24:35.767 183284 DEBUG nova.virt.libvirt.driver [None req-d766fb60-47b0-4d87-980f-8768bc564e3f 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 834c4219-2ee2-47cb-aa62-2ed8545ea4e6] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864
Jan 21 18:24:35 compute-0 nova_compute[183278]: 2026-01-21 18:24:35.768 183284 DEBUG nova.objects.instance [None req-d766fb60-47b0-4d87-980f-8768bc564e3f 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 834c4219-2ee2-47cb-aa62-2ed8545ea4e6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:24:35 compute-0 nova_compute[183278]: 2026-01-21 18:24:35.793 183284 DEBUG oslo_concurrency.processutils [None req-d766fb60-47b0-4d87-980f-8768bc564e3f 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:24:35 compute-0 nova_compute[183278]: 2026-01-21 18:24:35.851 183284 DEBUG oslo_concurrency.processutils [None req-d766fb60-47b0-4d87-980f-8768bc564e3f 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:24:35 compute-0 nova_compute[183278]: 2026-01-21 18:24:35.853 183284 DEBUG oslo_concurrency.lockutils [None req-d766fb60-47b0-4d87-980f-8768bc564e3f 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "8bf45782a806e8f2f684ae874be9ab99d891a685" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:24:35 compute-0 nova_compute[183278]: 2026-01-21 18:24:35.853 183284 DEBUG oslo_concurrency.lockutils [None req-d766fb60-47b0-4d87-980f-8768bc564e3f 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lock "8bf45782a806e8f2f684ae874be9ab99d891a685" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:24:35 compute-0 nova_compute[183278]: 2026-01-21 18:24:35.867 183284 DEBUG oslo_concurrency.processutils [None req-d766fb60-47b0-4d87-980f-8768bc564e3f 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:24:35 compute-0 nova_compute[183278]: 2026-01-21 18:24:35.922 183284 DEBUG oslo_concurrency.processutils [None req-d766fb60-47b0-4d87-980f-8768bc564e3f 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:24:35 compute-0 nova_compute[183278]: 2026-01-21 18:24:35.923 183284 DEBUG oslo_concurrency.processutils [None req-d766fb60-47b0-4d87-980f-8768bc564e3f 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685,backing_fmt=raw /var/lib/nova/instances/834c4219-2ee2-47cb-aa62-2ed8545ea4e6/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:24:35 compute-0 nova_compute[183278]: 2026-01-21 18:24:35.959 183284 DEBUG oslo_concurrency.processutils [None req-d766fb60-47b0-4d87-980f-8768bc564e3f 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685,backing_fmt=raw /var/lib/nova/instances/834c4219-2ee2-47cb-aa62-2ed8545ea4e6/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:24:35 compute-0 nova_compute[183278]: 2026-01-21 18:24:35.960 183284 DEBUG oslo_concurrency.lockutils [None req-d766fb60-47b0-4d87-980f-8768bc564e3f 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lock "8bf45782a806e8f2f684ae874be9ab99d891a685" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.107s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:24:35 compute-0 nova_compute[183278]: 2026-01-21 18:24:35.960 183284 DEBUG oslo_concurrency.processutils [None req-d766fb60-47b0-4d87-980f-8768bc564e3f 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:24:36 compute-0 nova_compute[183278]: 2026-01-21 18:24:36.010 183284 DEBUG oslo_concurrency.processutils [None req-d766fb60-47b0-4d87-980f-8768bc564e3f 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:24:36 compute-0 nova_compute[183278]: 2026-01-21 18:24:36.011 183284 DEBUG nova.virt.disk.api [None req-d766fb60-47b0-4d87-980f-8768bc564e3f 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Checking if we can resize image /var/lib/nova/instances/834c4219-2ee2-47cb-aa62-2ed8545ea4e6/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 18:24:36 compute-0 nova_compute[183278]: 2026-01-21 18:24:36.011 183284 DEBUG oslo_concurrency.processutils [None req-d766fb60-47b0-4d87-980f-8768bc564e3f 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/834c4219-2ee2-47cb-aa62-2ed8545ea4e6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:24:36 compute-0 nova_compute[183278]: 2026-01-21 18:24:36.063 183284 DEBUG oslo_concurrency.processutils [None req-d766fb60-47b0-4d87-980f-8768bc564e3f 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/834c4219-2ee2-47cb-aa62-2ed8545ea4e6/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:24:36 compute-0 nova_compute[183278]: 2026-01-21 18:24:36.064 183284 DEBUG nova.virt.disk.api [None req-d766fb60-47b0-4d87-980f-8768bc564e3f 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Cannot resize image /var/lib/nova/instances/834c4219-2ee2-47cb-aa62-2ed8545ea4e6/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 18:24:36 compute-0 nova_compute[183278]: 2026-01-21 18:24:36.064 183284 DEBUG nova.objects.instance [None req-d766fb60-47b0-4d87-980f-8768bc564e3f 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lazy-loading 'migration_context' on Instance uuid 834c4219-2ee2-47cb-aa62-2ed8545ea4e6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:24:36 compute-0 nova_compute[183278]: 2026-01-21 18:24:36.080 183284 DEBUG oslo_concurrency.processutils [None req-d766fb60-47b0-4d87-980f-8768bc564e3f 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/834c4219-2ee2-47cb-aa62-2ed8545ea4e6/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:24:36 compute-0 nova_compute[183278]: 2026-01-21 18:24:36.101 183284 DEBUG oslo_concurrency.processutils [None req-d766fb60-47b0-4d87-980f-8768bc564e3f 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/834c4219-2ee2-47cb-aa62-2ed8545ea4e6/disk.config 485376" returned: 0 in 0.021s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:24:36 compute-0 nova_compute[183278]: 2026-01-21 18:24:36.102 183284 DEBUG nova.virt.libvirt.volume.remotefs [None req-d766fb60-47b0-4d87-980f-8768bc564e3f 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Copying file compute-1.ctlplane.example.com:/var/lib/nova/instances/834c4219-2ee2-47cb-aa62-2ed8545ea4e6/disk.config to /var/lib/nova/instances/834c4219-2ee2-47cb-aa62-2ed8545ea4e6 copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 21 18:24:36 compute-0 nova_compute[183278]: 2026-01-21 18:24:36.103 183284 DEBUG oslo_concurrency.processutils [None req-d766fb60-47b0-4d87-980f-8768bc564e3f 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Running cmd (subprocess): scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/834c4219-2ee2-47cb-aa62-2ed8545ea4e6/disk.config /var/lib/nova/instances/834c4219-2ee2-47cb-aa62-2ed8545ea4e6 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:24:36 compute-0 nova_compute[183278]: 2026-01-21 18:24:36.510 183284 DEBUG oslo_concurrency.processutils [None req-d766fb60-47b0-4d87-980f-8768bc564e3f 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] CMD "scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/834c4219-2ee2-47cb-aa62-2ed8545ea4e6/disk.config /var/lib/nova/instances/834c4219-2ee2-47cb-aa62-2ed8545ea4e6" returned: 0 in 0.408s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:24:36 compute-0 nova_compute[183278]: 2026-01-21 18:24:36.511 183284 DEBUG nova.virt.libvirt.driver [None req-d766fb60-47b0-4d87-980f-8768bc564e3f 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 834c4219-2ee2-47cb-aa62-2ed8545ea4e6] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Jan 21 18:24:36 compute-0 nova_compute[183278]: 2026-01-21 18:24:36.512 183284 DEBUG nova.virt.libvirt.vif [None req-d766fb60-47b0-4d87-980f-8768bc564e3f 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T18:23:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-706167577',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-706167577',id=11,image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T18:23:57Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='fe688847145f4dee992c72dd40bbc1ac',ramdisk_id='',reservation_id='r-e6n0hcdh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',ow
ner_project_name='tempest-TestExecuteStrategies-1753607426',owner_user_name='tempest-TestExecuteStrategies-1753607426-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T18:23:58Z,user_data=None,user_id='41dc6e790bc54fbfaf5c6007d3fa5f63',uuid=834c4219-2ee2-47cb-aa62-2ed8545ea4e6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1dad2357-2038-47d2-9787-f54ea149b6c6", "address": "fa:16:3e:57:c0:6f", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap1dad2357-20", "ovs_interfaceid": "1dad2357-2038-47d2-9787-f54ea149b6c6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 21 18:24:36 compute-0 nova_compute[183278]: 2026-01-21 18:24:36.513 183284 DEBUG nova.network.os_vif_util [None req-d766fb60-47b0-4d87-980f-8768bc564e3f 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Converting VIF {"id": "1dad2357-2038-47d2-9787-f54ea149b6c6", "address": "fa:16:3e:57:c0:6f", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap1dad2357-20", "ovs_interfaceid": "1dad2357-2038-47d2-9787-f54ea149b6c6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 18:24:36 compute-0 nova_compute[183278]: 2026-01-21 18:24:36.513 183284 DEBUG nova.network.os_vif_util [None req-d766fb60-47b0-4d87-980f-8768bc564e3f 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:57:c0:6f,bridge_name='br-int',has_traffic_filtering=True,id=1dad2357-2038-47d2-9787-f54ea149b6c6,network=Network(405ec01b-76d3-4c3c-a31b-5f16d9641fbf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1dad2357-20') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 18:24:36 compute-0 nova_compute[183278]: 2026-01-21 18:24:36.514 183284 DEBUG os_vif [None req-d766fb60-47b0-4d87-980f-8768bc564e3f 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:57:c0:6f,bridge_name='br-int',has_traffic_filtering=True,id=1dad2357-2038-47d2-9787-f54ea149b6c6,network=Network(405ec01b-76d3-4c3c-a31b-5f16d9641fbf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1dad2357-20') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 21 18:24:36 compute-0 nova_compute[183278]: 2026-01-21 18:24:36.514 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:24:36 compute-0 nova_compute[183278]: 2026-01-21 18:24:36.515 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:24:36 compute-0 nova_compute[183278]: 2026-01-21 18:24:36.515 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 18:24:36 compute-0 nova_compute[183278]: 2026-01-21 18:24:36.517 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:24:36 compute-0 nova_compute[183278]: 2026-01-21 18:24:36.518 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1dad2357-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:24:36 compute-0 nova_compute[183278]: 2026-01-21 18:24:36.518 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1dad2357-20, col_values=(('external_ids', {'iface-id': '1dad2357-2038-47d2-9787-f54ea149b6c6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:57:c0:6f', 'vm-uuid': '834c4219-2ee2-47cb-aa62-2ed8545ea4e6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:24:36 compute-0 nova_compute[183278]: 2026-01-21 18:24:36.520 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:24:36 compute-0 NetworkManager[55506]: <info>  [1769019876.5208] manager: (tap1dad2357-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/47)
Jan 21 18:24:36 compute-0 nova_compute[183278]: 2026-01-21 18:24:36.522 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 18:24:36 compute-0 nova_compute[183278]: 2026-01-21 18:24:36.526 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:24:36 compute-0 nova_compute[183278]: 2026-01-21 18:24:36.527 183284 INFO os_vif [None req-d766fb60-47b0-4d87-980f-8768bc564e3f 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:57:c0:6f,bridge_name='br-int',has_traffic_filtering=True,id=1dad2357-2038-47d2-9787-f54ea149b6c6,network=Network(405ec01b-76d3-4c3c-a31b-5f16d9641fbf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1dad2357-20')
Jan 21 18:24:36 compute-0 nova_compute[183278]: 2026-01-21 18:24:36.527 183284 DEBUG nova.virt.libvirt.driver [None req-d766fb60-47b0-4d87-980f-8768bc564e3f 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Jan 21 18:24:36 compute-0 nova_compute[183278]: 2026-01-21 18:24:36.528 183284 DEBUG nova.compute.manager [None req-d766fb60-47b0-4d87-980f-8768bc564e3f 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp56pd7u8c',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='834c4219-2ee2-47cb-aa62-2ed8545ea4e6',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Jan 21 18:24:37 compute-0 sshd-session[207632]: Invalid user anvel from 64.227.98.100 port 53470
Jan 21 18:24:37 compute-0 sshd-session[207632]: Connection closed by invalid user anvel 64.227.98.100 port 53470 [preauth]
Jan 21 18:24:37 compute-0 nova_compute[183278]: 2026-01-21 18:24:37.720 183284 DEBUG nova.network.neutron [None req-d766fb60-47b0-4d87-980f-8768bc564e3f 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 834c4219-2ee2-47cb-aa62-2ed8545ea4e6] Port 1dad2357-2038-47d2-9787-f54ea149b6c6 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Jan 21 18:24:37 compute-0 nova_compute[183278]: 2026-01-21 18:24:37.722 183284 DEBUG nova.compute.manager [None req-d766fb60-47b0-4d87-980f-8768bc564e3f 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp56pd7u8c',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='834c4219-2ee2-47cb-aa62-2ed8545ea4e6',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Jan 21 18:24:37 compute-0 systemd[1]: Starting libvirt proxy daemon...
Jan 21 18:24:37 compute-0 systemd[1]: Started libvirt proxy daemon.
Jan 21 18:24:37 compute-0 kernel: tap1dad2357-20: entered promiscuous mode
Jan 21 18:24:37 compute-0 NetworkManager[55506]: <info>  [1769019877.9948] manager: (tap1dad2357-20): new Tun device (/org/freedesktop/NetworkManager/Devices/48)
Jan 21 18:24:37 compute-0 ovn_controller[95419]: 2026-01-21T18:24:37Z|00094|binding|INFO|Claiming lport 1dad2357-2038-47d2-9787-f54ea149b6c6 for this additional chassis.
Jan 21 18:24:37 compute-0 ovn_controller[95419]: 2026-01-21T18:24:37Z|00095|binding|INFO|1dad2357-2038-47d2-9787-f54ea149b6c6: Claiming fa:16:3e:57:c0:6f 10.100.0.8
Jan 21 18:24:37 compute-0 nova_compute[183278]: 2026-01-21 18:24:37.995 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:24:38 compute-0 ovn_controller[95419]: 2026-01-21T18:24:38Z|00096|binding|INFO|Setting lport 1dad2357-2038-47d2-9787-f54ea149b6c6 ovn-installed in OVS
Jan 21 18:24:38 compute-0 nova_compute[183278]: 2026-01-21 18:24:38.010 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:24:38 compute-0 nova_compute[183278]: 2026-01-21 18:24:38.013 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:24:38 compute-0 systemd-udevd[207668]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 18:24:38 compute-0 systemd-machined[154592]: New machine qemu-9-instance-0000000b.
Jan 21 18:24:38 compute-0 NetworkManager[55506]: <info>  [1769019878.0367] device (tap1dad2357-20): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 18:24:38 compute-0 NetworkManager[55506]: <info>  [1769019878.0377] device (tap1dad2357-20): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 18:24:38 compute-0 systemd[1]: Started Virtual Machine qemu-9-instance-0000000b.
Jan 21 18:24:38 compute-0 nova_compute[183278]: 2026-01-21 18:24:38.573 183284 DEBUG nova.virt.driver [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Emitting event <LifecycleEvent: 1769019878.5732932, 834c4219-2ee2-47cb-aa62-2ed8545ea4e6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:24:38 compute-0 nova_compute[183278]: 2026-01-21 18:24:38.574 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 834c4219-2ee2-47cb-aa62-2ed8545ea4e6] VM Started (Lifecycle Event)
Jan 21 18:24:38 compute-0 nova_compute[183278]: 2026-01-21 18:24:38.598 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 834c4219-2ee2-47cb-aa62-2ed8545ea4e6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:24:39 compute-0 nova_compute[183278]: 2026-01-21 18:24:39.387 183284 DEBUG nova.virt.driver [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Emitting event <LifecycleEvent: 1769019879.387564, 834c4219-2ee2-47cb-aa62-2ed8545ea4e6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:24:39 compute-0 nova_compute[183278]: 2026-01-21 18:24:39.388 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 834c4219-2ee2-47cb-aa62-2ed8545ea4e6] VM Resumed (Lifecycle Event)
Jan 21 18:24:39 compute-0 nova_compute[183278]: 2026-01-21 18:24:39.412 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 834c4219-2ee2-47cb-aa62-2ed8545ea4e6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:24:39 compute-0 nova_compute[183278]: 2026-01-21 18:24:39.415 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 834c4219-2ee2-47cb-aa62-2ed8545ea4e6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 18:24:39 compute-0 nova_compute[183278]: 2026-01-21 18:24:39.433 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 834c4219-2ee2-47cb-aa62-2ed8545ea4e6] During the sync_power process the instance has moved from host compute-1.ctlplane.example.com to host compute-0.ctlplane.example.com
Jan 21 18:24:40 compute-0 nova_compute[183278]: 2026-01-21 18:24:40.173 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:24:40 compute-0 ovn_controller[95419]: 2026-01-21T18:24:40Z|00097|binding|INFO|Claiming lport 1dad2357-2038-47d2-9787-f54ea149b6c6 for this chassis.
Jan 21 18:24:40 compute-0 ovn_controller[95419]: 2026-01-21T18:24:40Z|00098|binding|INFO|1dad2357-2038-47d2-9787-f54ea149b6c6: Claiming fa:16:3e:57:c0:6f 10.100.0.8
Jan 21 18:24:40 compute-0 ovn_controller[95419]: 2026-01-21T18:24:40Z|00099|binding|INFO|Setting lport 1dad2357-2038-47d2-9787-f54ea149b6c6 up in Southbound
Jan 21 18:24:40 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:24:40.758 104698 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:57:c0:6f 10.100.0.8'], port_security=['fa:16:3e:57:c0:6f 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '834c4219-2ee2-47cb-aa62-2ed8545ea4e6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-405ec01b-76d3-4c3c-a31b-5f16d9641fbf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe688847145f4dee992c72dd40bbc1ac', 'neutron:revision_number': '11', 'neutron:security_group_ids': '772bc98e-4f63-476c-ac15-1e185ee339f7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=64b67886-c0d9-40d2-a2d0-cf96a9cd3c14, chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>], logical_port=1dad2357-2038-47d2-9787-f54ea149b6c6) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 18:24:40 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:24:40.759 104698 INFO neutron.agent.ovn.metadata.agent [-] Port 1dad2357-2038-47d2-9787-f54ea149b6c6 in datapath 405ec01b-76d3-4c3c-a31b-5f16d9641fbf bound to our chassis
Jan 21 18:24:40 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:24:40.761 104698 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 405ec01b-76d3-4c3c-a31b-5f16d9641fbf
Jan 21 18:24:40 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:24:40.774 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[778a93b5-b188-4b8f-b7a8-9ce966c28981]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:24:40 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:24:40.802 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[41569f5a-2f26-4631-a514-dedcc0b81de3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:24:40 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:24:40.805 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[3879ffa5-ea01-4296-89dd-b6d0d9b31dce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:24:40 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:24:40.830 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[c4624781-b89a-4bfc-80af-1462dc1e86d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:24:40 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:24:40.845 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[6bf60724-916e-40bc-a8b4-a0edee1ac4e2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap405ec01b-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6c:95:02'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 441973, 'reachable_time': 19131, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 207700, 'error': None, 'target': 'ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:24:40 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:24:40.860 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[d2995447-c71b-49b6-ad63-b4f85601392f]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap405ec01b-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 441983, 'tstamp': 441983}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 207701, 'error': None, 'target': 'ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap405ec01b-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 441985, 'tstamp': 441985}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 207701, 'error': None, 'target': 'ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:24:40 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:24:40.862 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap405ec01b-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:24:40 compute-0 nova_compute[183278]: 2026-01-21 18:24:40.864 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:24:40 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:24:40.866 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap405ec01b-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:24:40 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:24:40.867 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 18:24:40 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:24:40.867 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap405ec01b-70, col_values=(('external_ids', {'iface-id': '9c897ad2-8ce5-4903-8c83-1ed8f117dcdd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:24:40 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:24:40.867 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 18:24:40 compute-0 nova_compute[183278]: 2026-01-21 18:24:40.885 183284 INFO nova.compute.manager [None req-d766fb60-47b0-4d87-980f-8768bc564e3f 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 834c4219-2ee2-47cb-aa62-2ed8545ea4e6] Post operation of migration started
Jan 21 18:24:41 compute-0 nova_compute[183278]: 2026-01-21 18:24:41.111 183284 DEBUG oslo_concurrency.lockutils [None req-d766fb60-47b0-4d87-980f-8768bc564e3f 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "refresh_cache-834c4219-2ee2-47cb-aa62-2ed8545ea4e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:24:41 compute-0 nova_compute[183278]: 2026-01-21 18:24:41.111 183284 DEBUG oslo_concurrency.lockutils [None req-d766fb60-47b0-4d87-980f-8768bc564e3f 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Acquired lock "refresh_cache-834c4219-2ee2-47cb-aa62-2ed8545ea4e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 18:24:41 compute-0 nova_compute[183278]: 2026-01-21 18:24:41.112 183284 DEBUG nova.network.neutron [None req-d766fb60-47b0-4d87-980f-8768bc564e3f 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 834c4219-2ee2-47cb-aa62-2ed8545ea4e6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 18:24:41 compute-0 nova_compute[183278]: 2026-01-21 18:24:41.521 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:24:42 compute-0 nova_compute[183278]: 2026-01-21 18:24:42.774 183284 DEBUG nova.network.neutron [None req-d766fb60-47b0-4d87-980f-8768bc564e3f 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 834c4219-2ee2-47cb-aa62-2ed8545ea4e6] Updating instance_info_cache with network_info: [{"id": "1dad2357-2038-47d2-9787-f54ea149b6c6", "address": "fa:16:3e:57:c0:6f", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1dad2357-20", "ovs_interfaceid": "1dad2357-2038-47d2-9787-f54ea149b6c6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:24:42 compute-0 nova_compute[183278]: 2026-01-21 18:24:42.790 183284 DEBUG oslo_concurrency.lockutils [None req-d766fb60-47b0-4d87-980f-8768bc564e3f 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Releasing lock "refresh_cache-834c4219-2ee2-47cb-aa62-2ed8545ea4e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 18:24:42 compute-0 nova_compute[183278]: 2026-01-21 18:24:42.807 183284 DEBUG oslo_concurrency.lockutils [None req-d766fb60-47b0-4d87-980f-8768bc564e3f 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:24:42 compute-0 nova_compute[183278]: 2026-01-21 18:24:42.808 183284 DEBUG oslo_concurrency.lockutils [None req-d766fb60-47b0-4d87-980f-8768bc564e3f 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:24:42 compute-0 nova_compute[183278]: 2026-01-21 18:24:42.808 183284 DEBUG oslo_concurrency.lockutils [None req-d766fb60-47b0-4d87-980f-8768bc564e3f 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:24:42 compute-0 nova_compute[183278]: 2026-01-21 18:24:42.812 183284 INFO nova.virt.libvirt.driver [None req-d766fb60-47b0-4d87-980f-8768bc564e3f 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 834c4219-2ee2-47cb-aa62-2ed8545ea4e6] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Jan 21 18:24:42 compute-0 virtqemud[182681]: Domain id=9 name='instance-0000000b' uuid=834c4219-2ee2-47cb-aa62-2ed8545ea4e6 is tainted: custom-monitor
Jan 21 18:24:43 compute-0 nova_compute[183278]: 2026-01-21 18:24:43.821 183284 INFO nova.virt.libvirt.driver [None req-d766fb60-47b0-4d87-980f-8768bc564e3f 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 834c4219-2ee2-47cb-aa62-2ed8545ea4e6] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Jan 21 18:24:44 compute-0 nova_compute[183278]: 2026-01-21 18:24:44.826 183284 INFO nova.virt.libvirt.driver [None req-d766fb60-47b0-4d87-980f-8768bc564e3f 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 834c4219-2ee2-47cb-aa62-2ed8545ea4e6] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Jan 21 18:24:44 compute-0 nova_compute[183278]: 2026-01-21 18:24:44.830 183284 DEBUG nova.compute.manager [None req-d766fb60-47b0-4d87-980f-8768bc564e3f 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 834c4219-2ee2-47cb-aa62-2ed8545ea4e6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:24:44 compute-0 nova_compute[183278]: 2026-01-21 18:24:44.852 183284 DEBUG nova.objects.instance [None req-d766fb60-47b0-4d87-980f-8768bc564e3f 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 834c4219-2ee2-47cb-aa62-2ed8545ea4e6] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Jan 21 18:24:45 compute-0 nova_compute[183278]: 2026-01-21 18:24:45.175 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:24:46 compute-0 nova_compute[183278]: 2026-01-21 18:24:46.523 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:24:49 compute-0 nova_compute[183278]: 2026-01-21 18:24:49.527 183284 DEBUG oslo_concurrency.lockutils [None req-b79fe929-c6a2-4278-9b8b-a512c40a2440 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Acquiring lock "846c40aa-a089-4213-89d3-b56681e73e18" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:24:49 compute-0 nova_compute[183278]: 2026-01-21 18:24:49.528 183284 DEBUG oslo_concurrency.lockutils [None req-b79fe929-c6a2-4278-9b8b-a512c40a2440 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "846c40aa-a089-4213-89d3-b56681e73e18" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:24:49 compute-0 nova_compute[183278]: 2026-01-21 18:24:49.528 183284 DEBUG oslo_concurrency.lockutils [None req-b79fe929-c6a2-4278-9b8b-a512c40a2440 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Acquiring lock "846c40aa-a089-4213-89d3-b56681e73e18-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:24:49 compute-0 nova_compute[183278]: 2026-01-21 18:24:49.528 183284 DEBUG oslo_concurrency.lockutils [None req-b79fe929-c6a2-4278-9b8b-a512c40a2440 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "846c40aa-a089-4213-89d3-b56681e73e18-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:24:49 compute-0 nova_compute[183278]: 2026-01-21 18:24:49.528 183284 DEBUG oslo_concurrency.lockutils [None req-b79fe929-c6a2-4278-9b8b-a512c40a2440 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "846c40aa-a089-4213-89d3-b56681e73e18-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:24:49 compute-0 nova_compute[183278]: 2026-01-21 18:24:49.529 183284 INFO nova.compute.manager [None req-b79fe929-c6a2-4278-9b8b-a512c40a2440 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 846c40aa-a089-4213-89d3-b56681e73e18] Terminating instance
Jan 21 18:24:49 compute-0 nova_compute[183278]: 2026-01-21 18:24:49.530 183284 DEBUG nova.compute.manager [None req-b79fe929-c6a2-4278-9b8b-a512c40a2440 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 846c40aa-a089-4213-89d3-b56681e73e18] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 21 18:24:49 compute-0 kernel: tap260f28a8-7a (unregistering): left promiscuous mode
Jan 21 18:24:49 compute-0 NetworkManager[55506]: <info>  [1769019889.5519] device (tap260f28a8-7a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 18:24:49 compute-0 ovn_controller[95419]: 2026-01-21T18:24:49Z|00100|binding|INFO|Releasing lport 260f28a8-7a1b-454a-830d-2f41597334af from this chassis (sb_readonly=0)
Jan 21 18:24:49 compute-0 ovn_controller[95419]: 2026-01-21T18:24:49Z|00101|binding|INFO|Setting lport 260f28a8-7a1b-454a-830d-2f41597334af down in Southbound
Jan 21 18:24:49 compute-0 nova_compute[183278]: 2026-01-21 18:24:49.558 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:24:49 compute-0 ovn_controller[95419]: 2026-01-21T18:24:49Z|00102|binding|INFO|Removing iface tap260f28a8-7a ovn-installed in OVS
Jan 21 18:24:49 compute-0 nova_compute[183278]: 2026-01-21 18:24:49.560 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:24:49 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:24:49.565 104698 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f4:3c:81 10.100.0.7'], port_security=['fa:16:3e:f4:3c:81 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '846c40aa-a089-4213-89d3-b56681e73e18', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-405ec01b-76d3-4c3c-a31b-5f16d9641fbf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe688847145f4dee992c72dd40bbc1ac', 'neutron:revision_number': '4', 'neutron:security_group_ids': '772bc98e-4f63-476c-ac15-1e185ee339f7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=64b67886-c0d9-40d2-a2d0-cf96a9cd3c14, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>], logical_port=260f28a8-7a1b-454a-830d-2f41597334af) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 18:24:49 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:24:49.567 104698 INFO neutron.agent.ovn.metadata.agent [-] Port 260f28a8-7a1b-454a-830d-2f41597334af in datapath 405ec01b-76d3-4c3c-a31b-5f16d9641fbf unbound from our chassis
Jan 21 18:24:49 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:24:49.569 104698 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 405ec01b-76d3-4c3c-a31b-5f16d9641fbf
Jan 21 18:24:49 compute-0 nova_compute[183278]: 2026-01-21 18:24:49.572 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:24:49 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:24:49.586 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[2c384549-1f05-4aa7-b517-ebe08a1790a7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:24:49 compute-0 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d0000000c.scope: Deactivated successfully.
Jan 21 18:24:49 compute-0 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d0000000c.scope: Consumed 13.552s CPU time.
Jan 21 18:24:49 compute-0 systemd-machined[154592]: Machine qemu-8-instance-0000000c terminated.
Jan 21 18:24:49 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:24:49.610 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[aaba269b-6a43-41fd-9767-104029c1a6f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:24:49 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:24:49.612 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[8a20ac2a-55b9-46a9-bd83-75acca890ec8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:24:49 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:24:49.635 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[b34b4fa7-4f44-4f6a-878a-00a8aa2dcbf3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:24:49 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:24:49.649 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[aa9839d2-f01b-4f21-8291-c1b6e84abb5c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap405ec01b-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6c:95:02'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 441973, 'reachable_time': 19131, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 207713, 'error': None, 'target': 'ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:24:49 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:24:49.664 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[0ebca27b-5ca3-4039-8a93-ab3a7d8bd8e5]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap405ec01b-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 441983, 'tstamp': 441983}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 207714, 'error': None, 'target': 'ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap405ec01b-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 441985, 'tstamp': 441985}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 207714, 'error': None, 'target': 'ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:24:49 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:24:49.666 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap405ec01b-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:24:49 compute-0 nova_compute[183278]: 2026-01-21 18:24:49.667 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:24:49 compute-0 nova_compute[183278]: 2026-01-21 18:24:49.670 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:24:49 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:24:49.671 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap405ec01b-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:24:49 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:24:49.671 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 18:24:49 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:24:49.672 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap405ec01b-70, col_values=(('external_ids', {'iface-id': '9c897ad2-8ce5-4903-8c83-1ed8f117dcdd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:24:49 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:24:49.672 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 18:24:49 compute-0 nova_compute[183278]: 2026-01-21 18:24:49.791 183284 INFO nova.virt.libvirt.driver [-] [instance: 846c40aa-a089-4213-89d3-b56681e73e18] Instance destroyed successfully.
Jan 21 18:24:49 compute-0 nova_compute[183278]: 2026-01-21 18:24:49.792 183284 DEBUG nova.objects.instance [None req-b79fe929-c6a2-4278-9b8b-a512c40a2440 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lazy-loading 'resources' on Instance uuid 846c40aa-a089-4213-89d3-b56681e73e18 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:24:49 compute-0 nova_compute[183278]: 2026-01-21 18:24:49.805 183284 DEBUG nova.virt.libvirt.vif [None req-b79fe929-c6a2-4278-9b8b-a512c40a2440 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T18:24:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-148334864',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-148334864',id=12,image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T18:24:16Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='fe688847145f4dee992c72dd40bbc1ac',ramdisk_id='',reservation_id='r-j008cwlr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name=
'tempest-TestExecuteStrategies-1753607426',owner_user_name='tempest-TestExecuteStrategies-1753607426-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T18:24:17Z,user_data=None,user_id='41dc6e790bc54fbfaf5c6007d3fa5f63',uuid=846c40aa-a089-4213-89d3-b56681e73e18,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "260f28a8-7a1b-454a-830d-2f41597334af", "address": "fa:16:3e:f4:3c:81", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap260f28a8-7a", "ovs_interfaceid": "260f28a8-7a1b-454a-830d-2f41597334af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 18:24:49 compute-0 nova_compute[183278]: 2026-01-21 18:24:49.805 183284 DEBUG nova.network.os_vif_util [None req-b79fe929-c6a2-4278-9b8b-a512c40a2440 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Converting VIF {"id": "260f28a8-7a1b-454a-830d-2f41597334af", "address": "fa:16:3e:f4:3c:81", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap260f28a8-7a", "ovs_interfaceid": "260f28a8-7a1b-454a-830d-2f41597334af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 18:24:49 compute-0 nova_compute[183278]: 2026-01-21 18:24:49.806 183284 DEBUG nova.network.os_vif_util [None req-b79fe929-c6a2-4278-9b8b-a512c40a2440 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f4:3c:81,bridge_name='br-int',has_traffic_filtering=True,id=260f28a8-7a1b-454a-830d-2f41597334af,network=Network(405ec01b-76d3-4c3c-a31b-5f16d9641fbf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap260f28a8-7a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 18:24:49 compute-0 nova_compute[183278]: 2026-01-21 18:24:49.806 183284 DEBUG os_vif [None req-b79fe929-c6a2-4278-9b8b-a512c40a2440 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f4:3c:81,bridge_name='br-int',has_traffic_filtering=True,id=260f28a8-7a1b-454a-830d-2f41597334af,network=Network(405ec01b-76d3-4c3c-a31b-5f16d9641fbf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap260f28a8-7a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 21 18:24:49 compute-0 nova_compute[183278]: 2026-01-21 18:24:49.807 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:24:49 compute-0 nova_compute[183278]: 2026-01-21 18:24:49.808 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap260f28a8-7a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:24:49 compute-0 nova_compute[183278]: 2026-01-21 18:24:49.809 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:24:49 compute-0 nova_compute[183278]: 2026-01-21 18:24:49.810 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:24:49 compute-0 nova_compute[183278]: 2026-01-21 18:24:49.811 183284 INFO os_vif [None req-b79fe929-c6a2-4278-9b8b-a512c40a2440 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f4:3c:81,bridge_name='br-int',has_traffic_filtering=True,id=260f28a8-7a1b-454a-830d-2f41597334af,network=Network(405ec01b-76d3-4c3c-a31b-5f16d9641fbf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap260f28a8-7a')
Jan 21 18:24:49 compute-0 nova_compute[183278]: 2026-01-21 18:24:49.812 183284 INFO nova.virt.libvirt.driver [None req-b79fe929-c6a2-4278-9b8b-a512c40a2440 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 846c40aa-a089-4213-89d3-b56681e73e18] Deleting instance files /var/lib/nova/instances/846c40aa-a089-4213-89d3-b56681e73e18_del
Jan 21 18:24:49 compute-0 nova_compute[183278]: 2026-01-21 18:24:49.812 183284 INFO nova.virt.libvirt.driver [None req-b79fe929-c6a2-4278-9b8b-a512c40a2440 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 846c40aa-a089-4213-89d3-b56681e73e18] Deletion of /var/lib/nova/instances/846c40aa-a089-4213-89d3-b56681e73e18_del complete
Jan 21 18:24:49 compute-0 nova_compute[183278]: 2026-01-21 18:24:49.857 183284 INFO nova.compute.manager [None req-b79fe929-c6a2-4278-9b8b-a512c40a2440 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 846c40aa-a089-4213-89d3-b56681e73e18] Took 0.33 seconds to destroy the instance on the hypervisor.
Jan 21 18:24:49 compute-0 nova_compute[183278]: 2026-01-21 18:24:49.857 183284 DEBUG oslo.service.loopingcall [None req-b79fe929-c6a2-4278-9b8b-a512c40a2440 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 21 18:24:49 compute-0 nova_compute[183278]: 2026-01-21 18:24:49.857 183284 DEBUG nova.compute.manager [-] [instance: 846c40aa-a089-4213-89d3-b56681e73e18] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 21 18:24:49 compute-0 nova_compute[183278]: 2026-01-21 18:24:49.858 183284 DEBUG nova.network.neutron [-] [instance: 846c40aa-a089-4213-89d3-b56681e73e18] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 21 18:24:49 compute-0 nova_compute[183278]: 2026-01-21 18:24:49.985 183284 DEBUG nova.compute.manager [req-b83eb4ad-804d-4ec5-8d89-6e60eb2e4ce8 req-b595cfe0-54fe-4fff-8802-2734c7a54c26 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 846c40aa-a089-4213-89d3-b56681e73e18] Received event network-vif-unplugged-260f28a8-7a1b-454a-830d-2f41597334af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:24:49 compute-0 nova_compute[183278]: 2026-01-21 18:24:49.985 183284 DEBUG oslo_concurrency.lockutils [req-b83eb4ad-804d-4ec5-8d89-6e60eb2e4ce8 req-b595cfe0-54fe-4fff-8802-2734c7a54c26 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "846c40aa-a089-4213-89d3-b56681e73e18-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:24:49 compute-0 nova_compute[183278]: 2026-01-21 18:24:49.985 183284 DEBUG oslo_concurrency.lockutils [req-b83eb4ad-804d-4ec5-8d89-6e60eb2e4ce8 req-b595cfe0-54fe-4fff-8802-2734c7a54c26 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "846c40aa-a089-4213-89d3-b56681e73e18-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:24:49 compute-0 nova_compute[183278]: 2026-01-21 18:24:49.986 183284 DEBUG oslo_concurrency.lockutils [req-b83eb4ad-804d-4ec5-8d89-6e60eb2e4ce8 req-b595cfe0-54fe-4fff-8802-2734c7a54c26 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "846c40aa-a089-4213-89d3-b56681e73e18-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:24:49 compute-0 nova_compute[183278]: 2026-01-21 18:24:49.986 183284 DEBUG nova.compute.manager [req-b83eb4ad-804d-4ec5-8d89-6e60eb2e4ce8 req-b595cfe0-54fe-4fff-8802-2734c7a54c26 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 846c40aa-a089-4213-89d3-b56681e73e18] No waiting events found dispatching network-vif-unplugged-260f28a8-7a1b-454a-830d-2f41597334af pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:24:49 compute-0 nova_compute[183278]: 2026-01-21 18:24:49.986 183284 DEBUG nova.compute.manager [req-b83eb4ad-804d-4ec5-8d89-6e60eb2e4ce8 req-b595cfe0-54fe-4fff-8802-2734c7a54c26 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 846c40aa-a089-4213-89d3-b56681e73e18] Received event network-vif-unplugged-260f28a8-7a1b-454a-830d-2f41597334af for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 21 18:24:50 compute-0 nova_compute[183278]: 2026-01-21 18:24:50.177 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:24:50 compute-0 nova_compute[183278]: 2026-01-21 18:24:50.381 183284 DEBUG nova.network.neutron [-] [instance: 846c40aa-a089-4213-89d3-b56681e73e18] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:24:50 compute-0 nova_compute[183278]: 2026-01-21 18:24:50.401 183284 INFO nova.compute.manager [-] [instance: 846c40aa-a089-4213-89d3-b56681e73e18] Took 0.54 seconds to deallocate network for instance.
Jan 21 18:24:50 compute-0 nova_compute[183278]: 2026-01-21 18:24:50.470 183284 DEBUG oslo_concurrency.lockutils [None req-b79fe929-c6a2-4278-9b8b-a512c40a2440 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:24:50 compute-0 nova_compute[183278]: 2026-01-21 18:24:50.471 183284 DEBUG oslo_concurrency.lockutils [None req-b79fe929-c6a2-4278-9b8b-a512c40a2440 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:24:50 compute-0 nova_compute[183278]: 2026-01-21 18:24:50.536 183284 DEBUG nova.compute.provider_tree [None req-b79fe929-c6a2-4278-9b8b-a512c40a2440 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Inventory has not changed in ProviderTree for provider: 502e4243-611b-433d-a766-9b485d51652d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 18:24:50 compute-0 nova_compute[183278]: 2026-01-21 18:24:50.553 183284 DEBUG nova.scheduler.client.report [None req-b79fe929-c6a2-4278-9b8b-a512c40a2440 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Inventory has not changed for provider 502e4243-611b-433d-a766-9b485d51652d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 18:24:50 compute-0 nova_compute[183278]: 2026-01-21 18:24:50.579 183284 DEBUG oslo_concurrency.lockutils [None req-b79fe929-c6a2-4278-9b8b-a512c40a2440 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:24:50 compute-0 nova_compute[183278]: 2026-01-21 18:24:50.602 183284 INFO nova.scheduler.client.report [None req-b79fe929-c6a2-4278-9b8b-a512c40a2440 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Deleted allocations for instance 846c40aa-a089-4213-89d3-b56681e73e18
Jan 21 18:24:50 compute-0 nova_compute[183278]: 2026-01-21 18:24:50.673 183284 DEBUG oslo_concurrency.lockutils [None req-b79fe929-c6a2-4278-9b8b-a512c40a2440 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "846c40aa-a089-4213-89d3-b56681e73e18" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.145s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:24:51 compute-0 nova_compute[183278]: 2026-01-21 18:24:51.288 183284 DEBUG oslo_concurrency.lockutils [None req-36a35704-767e-42fb-8b4f-27afc18a7daa 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Acquiring lock "834c4219-2ee2-47cb-aa62-2ed8545ea4e6" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:24:51 compute-0 nova_compute[183278]: 2026-01-21 18:24:51.289 183284 DEBUG oslo_concurrency.lockutils [None req-36a35704-767e-42fb-8b4f-27afc18a7daa 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "834c4219-2ee2-47cb-aa62-2ed8545ea4e6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:24:51 compute-0 nova_compute[183278]: 2026-01-21 18:24:51.289 183284 DEBUG oslo_concurrency.lockutils [None req-36a35704-767e-42fb-8b4f-27afc18a7daa 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Acquiring lock "834c4219-2ee2-47cb-aa62-2ed8545ea4e6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:24:51 compute-0 nova_compute[183278]: 2026-01-21 18:24:51.289 183284 DEBUG oslo_concurrency.lockutils [None req-36a35704-767e-42fb-8b4f-27afc18a7daa 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "834c4219-2ee2-47cb-aa62-2ed8545ea4e6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:24:51 compute-0 nova_compute[183278]: 2026-01-21 18:24:51.290 183284 DEBUG oslo_concurrency.lockutils [None req-36a35704-767e-42fb-8b4f-27afc18a7daa 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "834c4219-2ee2-47cb-aa62-2ed8545ea4e6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:24:51 compute-0 nova_compute[183278]: 2026-01-21 18:24:51.291 183284 INFO nova.compute.manager [None req-36a35704-767e-42fb-8b4f-27afc18a7daa 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 834c4219-2ee2-47cb-aa62-2ed8545ea4e6] Terminating instance
Jan 21 18:24:51 compute-0 nova_compute[183278]: 2026-01-21 18:24:51.292 183284 DEBUG nova.compute.manager [None req-36a35704-767e-42fb-8b4f-27afc18a7daa 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 834c4219-2ee2-47cb-aa62-2ed8545ea4e6] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 21 18:24:51 compute-0 kernel: tap1dad2357-20 (unregistering): left promiscuous mode
Jan 21 18:24:51 compute-0 NetworkManager[55506]: <info>  [1769019891.3132] device (tap1dad2357-20): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 18:24:51 compute-0 ovn_controller[95419]: 2026-01-21T18:24:51Z|00103|binding|INFO|Releasing lport 1dad2357-2038-47d2-9787-f54ea149b6c6 from this chassis (sb_readonly=0)
Jan 21 18:24:51 compute-0 ovn_controller[95419]: 2026-01-21T18:24:51Z|00104|binding|INFO|Setting lport 1dad2357-2038-47d2-9787-f54ea149b6c6 down in Southbound
Jan 21 18:24:51 compute-0 ovn_controller[95419]: 2026-01-21T18:24:51Z|00105|binding|INFO|Removing iface tap1dad2357-20 ovn-installed in OVS
Jan 21 18:24:51 compute-0 nova_compute[183278]: 2026-01-21 18:24:51.316 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:24:51 compute-0 nova_compute[183278]: 2026-01-21 18:24:51.318 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:24:51 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:24:51.324 104698 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:57:c0:6f 10.100.0.8'], port_security=['fa:16:3e:57:c0:6f 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '834c4219-2ee2-47cb-aa62-2ed8545ea4e6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-405ec01b-76d3-4c3c-a31b-5f16d9641fbf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe688847145f4dee992c72dd40bbc1ac', 'neutron:revision_number': '13', 'neutron:security_group_ids': '772bc98e-4f63-476c-ac15-1e185ee339f7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=64b67886-c0d9-40d2-a2d0-cf96a9cd3c14, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>], logical_port=1dad2357-2038-47d2-9787-f54ea149b6c6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 18:24:51 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:24:51.325 104698 INFO neutron.agent.ovn.metadata.agent [-] Port 1dad2357-2038-47d2-9787-f54ea149b6c6 in datapath 405ec01b-76d3-4c3c-a31b-5f16d9641fbf unbound from our chassis
Jan 21 18:24:51 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:24:51.326 104698 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 405ec01b-76d3-4c3c-a31b-5f16d9641fbf, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 21 18:24:51 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:24:51.327 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[0c90304e-f118-46c7-920d-b0fa17fe2a9b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:24:51 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:24:51.327 104698 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf namespace which is not needed anymore
Jan 21 18:24:51 compute-0 nova_compute[183278]: 2026-01-21 18:24:51.333 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:24:51 compute-0 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d0000000b.scope: Deactivated successfully.
Jan 21 18:24:51 compute-0 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d0000000b.scope: Consumed 1.494s CPU time.
Jan 21 18:24:51 compute-0 systemd-machined[154592]: Machine qemu-9-instance-0000000b terminated.
Jan 21 18:24:51 compute-0 neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf[207490]: [NOTICE]   (207494) : haproxy version is 2.8.14-c23fe91
Jan 21 18:24:51 compute-0 neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf[207490]: [NOTICE]   (207494) : path to executable is /usr/sbin/haproxy
Jan 21 18:24:51 compute-0 neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf[207490]: [WARNING]  (207494) : Exiting Master process...
Jan 21 18:24:51 compute-0 neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf[207490]: [ALERT]    (207494) : Current worker (207496) exited with code 143 (Terminated)
Jan 21 18:24:51 compute-0 neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf[207490]: [WARNING]  (207494) : All workers exited. Exiting... (0)
Jan 21 18:24:51 compute-0 systemd[1]: libpod-0f939214e535ec6a74bcf45ab9708c0b132664b4f126e8a2b5f9ecf05c34ebd9.scope: Deactivated successfully.
Jan 21 18:24:51 compute-0 podman[207756]: 2026-01-21 18:24:51.472052256 +0000 UTC m=+0.046083436 container died 0f939214e535ec6a74bcf45ab9708c0b132664b4f126e8a2b5f9ecf05c34ebd9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 18:24:51 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0f939214e535ec6a74bcf45ab9708c0b132664b4f126e8a2b5f9ecf05c34ebd9-userdata-shm.mount: Deactivated successfully.
Jan 21 18:24:51 compute-0 systemd[1]: var-lib-containers-storage-overlay-596f5bf6e4fa2adfd994c03dbfd407c69aec9e5b0c491d64cbba3a1338675973-merged.mount: Deactivated successfully.
Jan 21 18:24:51 compute-0 NetworkManager[55506]: <info>  [1769019891.5102] manager: (tap1dad2357-20): new Tun device (/org/freedesktop/NetworkManager/Devices/49)
Jan 21 18:24:51 compute-0 podman[207756]: 2026-01-21 18:24:51.51523498 +0000 UTC m=+0.089266170 container cleanup 0f939214e535ec6a74bcf45ab9708c0b132664b4f126e8a2b5f9ecf05c34ebd9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, io.buildah.version=1.41.3)
Jan 21 18:24:51 compute-0 systemd[1]: libpod-conmon-0f939214e535ec6a74bcf45ab9708c0b132664b4f126e8a2b5f9ecf05c34ebd9.scope: Deactivated successfully.
Jan 21 18:24:51 compute-0 nova_compute[183278]: 2026-01-21 18:24:51.554 183284 INFO nova.virt.libvirt.driver [-] [instance: 834c4219-2ee2-47cb-aa62-2ed8545ea4e6] Instance destroyed successfully.
Jan 21 18:24:51 compute-0 nova_compute[183278]: 2026-01-21 18:24:51.554 183284 DEBUG nova.objects.instance [None req-36a35704-767e-42fb-8b4f-27afc18a7daa 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lazy-loading 'resources' on Instance uuid 834c4219-2ee2-47cb-aa62-2ed8545ea4e6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:24:51 compute-0 nova_compute[183278]: 2026-01-21 18:24:51.585 183284 DEBUG nova.virt.libvirt.vif [None req-36a35704-767e-42fb-8b4f-27afc18a7daa 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-21T18:23:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-706167577',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-706167577',id=11,image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T18:23:57Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='fe688847145f4dee992c72dd40bbc1ac',ramdisk_id='',reservation_id='r-e6n0hcdh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1753607426',owner_user_name='tempest-TestExecuteStrategies-1753607426-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T18:24:44Z,user_data=None,user_id='41dc6e790bc54fbfaf5c6007d3fa5f63',uuid=834c4219-2ee2-47cb-aa62-2ed8545ea4e6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1dad2357-2038-47d2-9787-f54ea149b6c6", "address": "fa:16:3e:57:c0:6f", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1dad2357-20", "ovs_interfaceid": "1dad2357-2038-47d2-9787-f54ea149b6c6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 18:24:51 compute-0 nova_compute[183278]: 2026-01-21 18:24:51.586 183284 DEBUG nova.network.os_vif_util [None req-36a35704-767e-42fb-8b4f-27afc18a7daa 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Converting VIF {"id": "1dad2357-2038-47d2-9787-f54ea149b6c6", "address": "fa:16:3e:57:c0:6f", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1dad2357-20", "ovs_interfaceid": "1dad2357-2038-47d2-9787-f54ea149b6c6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 18:24:51 compute-0 nova_compute[183278]: 2026-01-21 18:24:51.587 183284 DEBUG nova.network.os_vif_util [None req-36a35704-767e-42fb-8b4f-27afc18a7daa 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:57:c0:6f,bridge_name='br-int',has_traffic_filtering=True,id=1dad2357-2038-47d2-9787-f54ea149b6c6,network=Network(405ec01b-76d3-4c3c-a31b-5f16d9641fbf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1dad2357-20') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 18:24:51 compute-0 nova_compute[183278]: 2026-01-21 18:24:51.587 183284 DEBUG os_vif [None req-36a35704-767e-42fb-8b4f-27afc18a7daa 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:57:c0:6f,bridge_name='br-int',has_traffic_filtering=True,id=1dad2357-2038-47d2-9787-f54ea149b6c6,network=Network(405ec01b-76d3-4c3c-a31b-5f16d9641fbf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1dad2357-20') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 21 18:24:51 compute-0 nova_compute[183278]: 2026-01-21 18:24:51.589 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:24:51 compute-0 podman[207796]: 2026-01-21 18:24:51.589137469 +0000 UTC m=+0.044150379 container remove 0f939214e535ec6a74bcf45ab9708c0b132664b4f126e8a2b5f9ecf05c34ebd9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 21 18:24:51 compute-0 nova_compute[183278]: 2026-01-21 18:24:51.589 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1dad2357-20, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:24:51 compute-0 nova_compute[183278]: 2026-01-21 18:24:51.590 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:24:51 compute-0 nova_compute[183278]: 2026-01-21 18:24:51.591 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:24:51 compute-0 nova_compute[183278]: 2026-01-21 18:24:51.593 183284 INFO os_vif [None req-36a35704-767e-42fb-8b4f-27afc18a7daa 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:57:c0:6f,bridge_name='br-int',has_traffic_filtering=True,id=1dad2357-2038-47d2-9787-f54ea149b6c6,network=Network(405ec01b-76d3-4c3c-a31b-5f16d9641fbf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1dad2357-20')
Jan 21 18:24:51 compute-0 nova_compute[183278]: 2026-01-21 18:24:51.594 183284 INFO nova.virt.libvirt.driver [None req-36a35704-767e-42fb-8b4f-27afc18a7daa 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 834c4219-2ee2-47cb-aa62-2ed8545ea4e6] Deleting instance files /var/lib/nova/instances/834c4219-2ee2-47cb-aa62-2ed8545ea4e6_del
Jan 21 18:24:51 compute-0 nova_compute[183278]: 2026-01-21 18:24:51.595 183284 INFO nova.virt.libvirt.driver [None req-36a35704-767e-42fb-8b4f-27afc18a7daa 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 834c4219-2ee2-47cb-aa62-2ed8545ea4e6] Deletion of /var/lib/nova/instances/834c4219-2ee2-47cb-aa62-2ed8545ea4e6_del complete
Jan 21 18:24:51 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:24:51.596 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[79d0a1a5-473f-4cd7-894b-6e312bcff43d]: (4, ('Wed Jan 21 06:24:51 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf (0f939214e535ec6a74bcf45ab9708c0b132664b4f126e8a2b5f9ecf05c34ebd9)\n0f939214e535ec6a74bcf45ab9708c0b132664b4f126e8a2b5f9ecf05c34ebd9\nWed Jan 21 06:24:51 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf (0f939214e535ec6a74bcf45ab9708c0b132664b4f126e8a2b5f9ecf05c34ebd9)\n0f939214e535ec6a74bcf45ab9708c0b132664b4f126e8a2b5f9ecf05c34ebd9\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:24:51 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:24:51.597 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[9beaf7af-44e1-4c90-9ed3-c3b65e814f99]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:24:51 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:24:51.597 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap405ec01b-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:24:51 compute-0 nova_compute[183278]: 2026-01-21 18:24:51.599 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:24:51 compute-0 kernel: tap405ec01b-70: left promiscuous mode
Jan 21 18:24:51 compute-0 nova_compute[183278]: 2026-01-21 18:24:51.610 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:24:51 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:24:51.613 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[fd98d9d9-00a0-4731-9931-c0c644ec12e0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:24:51 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:24:51.627 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[478513e0-26ab-48e9-be2b-fd603a1f90f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:24:51 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:24:51.629 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[5f3a4cf8-0111-447d-b0a5-de0f65a60807]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:24:51 compute-0 nova_compute[183278]: 2026-01-21 18:24:51.641 183284 INFO nova.compute.manager [None req-36a35704-767e-42fb-8b4f-27afc18a7daa 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 834c4219-2ee2-47cb-aa62-2ed8545ea4e6] Took 0.35 seconds to destroy the instance on the hypervisor.
Jan 21 18:24:51 compute-0 nova_compute[183278]: 2026-01-21 18:24:51.642 183284 DEBUG oslo.service.loopingcall [None req-36a35704-767e-42fb-8b4f-27afc18a7daa 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 21 18:24:51 compute-0 nova_compute[183278]: 2026-01-21 18:24:51.642 183284 DEBUG nova.compute.manager [-] [instance: 834c4219-2ee2-47cb-aa62-2ed8545ea4e6] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 21 18:24:51 compute-0 nova_compute[183278]: 2026-01-21 18:24:51.642 183284 DEBUG nova.network.neutron [-] [instance: 834c4219-2ee2-47cb-aa62-2ed8545ea4e6] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 21 18:24:51 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:24:51.641 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[55555ef2-2749-4eaf-aa08-d8edbf9438b9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 441966, 'reachable_time': 32465, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 207819, 'error': None, 'target': 'ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:24:51 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:24:51.644 105036 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 21 18:24:51 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:24:51.644 105036 DEBUG oslo.privsep.daemon [-] privsep: reply[7e3c603e-f7c9-423c-95a8-fd2d3bcc86b7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:24:51 compute-0 systemd[1]: run-netns-ovnmeta\x2d405ec01b\x2d76d3\x2d4c3c\x2da31b\x2d5f16d9641fbf.mount: Deactivated successfully.
Jan 21 18:24:52 compute-0 nova_compute[183278]: 2026-01-21 18:24:52.072 183284 DEBUG nova.compute.manager [req-7ca0b8d3-4b3a-4323-9a6f-bf3c5daeb4d4 req-fae32863-1942-4281-88ef-6d08684b17ca 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 834c4219-2ee2-47cb-aa62-2ed8545ea4e6] Received event network-vif-unplugged-1dad2357-2038-47d2-9787-f54ea149b6c6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:24:52 compute-0 nova_compute[183278]: 2026-01-21 18:24:52.072 183284 DEBUG oslo_concurrency.lockutils [req-7ca0b8d3-4b3a-4323-9a6f-bf3c5daeb4d4 req-fae32863-1942-4281-88ef-6d08684b17ca 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "834c4219-2ee2-47cb-aa62-2ed8545ea4e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:24:52 compute-0 nova_compute[183278]: 2026-01-21 18:24:52.072 183284 DEBUG oslo_concurrency.lockutils [req-7ca0b8d3-4b3a-4323-9a6f-bf3c5daeb4d4 req-fae32863-1942-4281-88ef-6d08684b17ca 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "834c4219-2ee2-47cb-aa62-2ed8545ea4e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:24:52 compute-0 nova_compute[183278]: 2026-01-21 18:24:52.073 183284 DEBUG oslo_concurrency.lockutils [req-7ca0b8d3-4b3a-4323-9a6f-bf3c5daeb4d4 req-fae32863-1942-4281-88ef-6d08684b17ca 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "834c4219-2ee2-47cb-aa62-2ed8545ea4e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:24:52 compute-0 nova_compute[183278]: 2026-01-21 18:24:52.073 183284 DEBUG nova.compute.manager [req-7ca0b8d3-4b3a-4323-9a6f-bf3c5daeb4d4 req-fae32863-1942-4281-88ef-6d08684b17ca 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 834c4219-2ee2-47cb-aa62-2ed8545ea4e6] No waiting events found dispatching network-vif-unplugged-1dad2357-2038-47d2-9787-f54ea149b6c6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:24:52 compute-0 nova_compute[183278]: 2026-01-21 18:24:52.073 183284 DEBUG nova.compute.manager [req-7ca0b8d3-4b3a-4323-9a6f-bf3c5daeb4d4 req-fae32863-1942-4281-88ef-6d08684b17ca 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 834c4219-2ee2-47cb-aa62-2ed8545ea4e6] Received event network-vif-unplugged-1dad2357-2038-47d2-9787-f54ea149b6c6 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 21 18:24:52 compute-0 nova_compute[183278]: 2026-01-21 18:24:52.276 183284 DEBUG nova.compute.manager [req-08a4b1bd-dbd8-4833-bea4-8ecc1ae27134 req-546f8abd-3925-475d-ac27-42f6f79e9a65 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 846c40aa-a089-4213-89d3-b56681e73e18] Received event network-vif-plugged-260f28a8-7a1b-454a-830d-2f41597334af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:24:52 compute-0 nova_compute[183278]: 2026-01-21 18:24:52.276 183284 DEBUG oslo_concurrency.lockutils [req-08a4b1bd-dbd8-4833-bea4-8ecc1ae27134 req-546f8abd-3925-475d-ac27-42f6f79e9a65 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "846c40aa-a089-4213-89d3-b56681e73e18-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:24:52 compute-0 nova_compute[183278]: 2026-01-21 18:24:52.276 183284 DEBUG oslo_concurrency.lockutils [req-08a4b1bd-dbd8-4833-bea4-8ecc1ae27134 req-546f8abd-3925-475d-ac27-42f6f79e9a65 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "846c40aa-a089-4213-89d3-b56681e73e18-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:24:52 compute-0 nova_compute[183278]: 2026-01-21 18:24:52.276 183284 DEBUG oslo_concurrency.lockutils [req-08a4b1bd-dbd8-4833-bea4-8ecc1ae27134 req-546f8abd-3925-475d-ac27-42f6f79e9a65 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "846c40aa-a089-4213-89d3-b56681e73e18-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:24:52 compute-0 nova_compute[183278]: 2026-01-21 18:24:52.277 183284 DEBUG nova.compute.manager [req-08a4b1bd-dbd8-4833-bea4-8ecc1ae27134 req-546f8abd-3925-475d-ac27-42f6f79e9a65 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 846c40aa-a089-4213-89d3-b56681e73e18] No waiting events found dispatching network-vif-plugged-260f28a8-7a1b-454a-830d-2f41597334af pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:24:52 compute-0 nova_compute[183278]: 2026-01-21 18:24:52.277 183284 WARNING nova.compute.manager [req-08a4b1bd-dbd8-4833-bea4-8ecc1ae27134 req-546f8abd-3925-475d-ac27-42f6f79e9a65 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 846c40aa-a089-4213-89d3-b56681e73e18] Received unexpected event network-vif-plugged-260f28a8-7a1b-454a-830d-2f41597334af for instance with vm_state deleted and task_state None.
Jan 21 18:24:52 compute-0 nova_compute[183278]: 2026-01-21 18:24:52.277 183284 DEBUG nova.compute.manager [req-08a4b1bd-dbd8-4833-bea4-8ecc1ae27134 req-546f8abd-3925-475d-ac27-42f6f79e9a65 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 846c40aa-a089-4213-89d3-b56681e73e18] Received event network-vif-deleted-260f28a8-7a1b-454a-830d-2f41597334af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:24:53 compute-0 podman[207820]: 2026-01-21 18:24:53.021214377 +0000 UTC m=+0.075464307 container health_status 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, vcs-type=git, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, distribution-scope=public, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64)
Jan 21 18:24:53 compute-0 nova_compute[183278]: 2026-01-21 18:24:53.066 183284 DEBUG nova.network.neutron [-] [instance: 834c4219-2ee2-47cb-aa62-2ed8545ea4e6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:24:53 compute-0 nova_compute[183278]: 2026-01-21 18:24:53.425 183284 INFO nova.compute.manager [-] [instance: 834c4219-2ee2-47cb-aa62-2ed8545ea4e6] Took 1.78 seconds to deallocate network for instance.
Jan 21 18:24:53 compute-0 nova_compute[183278]: 2026-01-21 18:24:53.493 183284 DEBUG oslo_concurrency.lockutils [None req-36a35704-767e-42fb-8b4f-27afc18a7daa 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:24:53 compute-0 nova_compute[183278]: 2026-01-21 18:24:53.494 183284 DEBUG oslo_concurrency.lockutils [None req-36a35704-767e-42fb-8b4f-27afc18a7daa 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:24:53 compute-0 nova_compute[183278]: 2026-01-21 18:24:53.506 183284 DEBUG oslo_concurrency.lockutils [None req-36a35704-767e-42fb-8b4f-27afc18a7daa 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.012s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:24:53 compute-0 nova_compute[183278]: 2026-01-21 18:24:53.530 183284 INFO nova.scheduler.client.report [None req-36a35704-767e-42fb-8b4f-27afc18a7daa 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Deleted allocations for instance 834c4219-2ee2-47cb-aa62-2ed8545ea4e6
Jan 21 18:24:53 compute-0 nova_compute[183278]: 2026-01-21 18:24:53.622 183284 DEBUG oslo_concurrency.lockutils [None req-36a35704-767e-42fb-8b4f-27afc18a7daa 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "834c4219-2ee2-47cb-aa62-2ed8545ea4e6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.334s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:24:54 compute-0 nova_compute[183278]: 2026-01-21 18:24:54.159 183284 DEBUG nova.compute.manager [req-65ec13ab-6157-49df-b81e-1cab3c18e75c req-0b0e26b9-81cd-4484-8723-51020d200aaf 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 834c4219-2ee2-47cb-aa62-2ed8545ea4e6] Received event network-vif-plugged-1dad2357-2038-47d2-9787-f54ea149b6c6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:24:54 compute-0 nova_compute[183278]: 2026-01-21 18:24:54.159 183284 DEBUG oslo_concurrency.lockutils [req-65ec13ab-6157-49df-b81e-1cab3c18e75c req-0b0e26b9-81cd-4484-8723-51020d200aaf 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "834c4219-2ee2-47cb-aa62-2ed8545ea4e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:24:54 compute-0 nova_compute[183278]: 2026-01-21 18:24:54.160 183284 DEBUG oslo_concurrency.lockutils [req-65ec13ab-6157-49df-b81e-1cab3c18e75c req-0b0e26b9-81cd-4484-8723-51020d200aaf 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "834c4219-2ee2-47cb-aa62-2ed8545ea4e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:24:54 compute-0 nova_compute[183278]: 2026-01-21 18:24:54.160 183284 DEBUG oslo_concurrency.lockutils [req-65ec13ab-6157-49df-b81e-1cab3c18e75c req-0b0e26b9-81cd-4484-8723-51020d200aaf 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "834c4219-2ee2-47cb-aa62-2ed8545ea4e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:24:54 compute-0 nova_compute[183278]: 2026-01-21 18:24:54.160 183284 DEBUG nova.compute.manager [req-65ec13ab-6157-49df-b81e-1cab3c18e75c req-0b0e26b9-81cd-4484-8723-51020d200aaf 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 834c4219-2ee2-47cb-aa62-2ed8545ea4e6] No waiting events found dispatching network-vif-plugged-1dad2357-2038-47d2-9787-f54ea149b6c6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:24:54 compute-0 nova_compute[183278]: 2026-01-21 18:24:54.160 183284 WARNING nova.compute.manager [req-65ec13ab-6157-49df-b81e-1cab3c18e75c req-0b0e26b9-81cd-4484-8723-51020d200aaf 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 834c4219-2ee2-47cb-aa62-2ed8545ea4e6] Received unexpected event network-vif-plugged-1dad2357-2038-47d2-9787-f54ea149b6c6 for instance with vm_state deleted and task_state None.
Jan 21 18:24:54 compute-0 nova_compute[183278]: 2026-01-21 18:24:54.161 183284 DEBUG nova.compute.manager [req-65ec13ab-6157-49df-b81e-1cab3c18e75c req-0b0e26b9-81cd-4484-8723-51020d200aaf 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 834c4219-2ee2-47cb-aa62-2ed8545ea4e6] Received event network-vif-deleted-1dad2357-2038-47d2-9787-f54ea149b6c6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:24:55 compute-0 nova_compute[183278]: 2026-01-21 18:24:55.178 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:24:56 compute-0 nova_compute[183278]: 2026-01-21 18:24:56.637 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:24:59 compute-0 podman[192560]: time="2026-01-21T18:24:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 18:24:59 compute-0 podman[192560]: @ - - [21/Jan/2026:18:24:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15354 "" "Go-http-client/1.1"
Jan 21 18:24:59 compute-0 podman[192560]: @ - - [21/Jan/2026:18:24:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2175 "" "Go-http-client/1.1"
Jan 21 18:25:00 compute-0 podman[207843]: 2026-01-21 18:25:00.003402315 +0000 UTC m=+0.050544854 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 21 18:25:00 compute-0 podman[207842]: 2026-01-21 18:25:00.047169753 +0000 UTC m=+0.100036090 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, 
managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 21 18:25:00 compute-0 nova_compute[183278]: 2026-01-21 18:25:00.225 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:25:01 compute-0 openstack_network_exporter[195402]: ERROR   18:25:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 21 18:25:01 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:25:01 compute-0 openstack_network_exporter[195402]: ERROR   18:25:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 21 18:25:01 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:25:01 compute-0 nova_compute[183278]: 2026-01-21 18:25:01.639 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:25:04 compute-0 nova_compute[183278]: 2026-01-21 18:25:04.791 183284 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769019889.78905, 846c40aa-a089-4213-89d3-b56681e73e18 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:25:04 compute-0 nova_compute[183278]: 2026-01-21 18:25:04.792 183284 INFO nova.compute.manager [-] [instance: 846c40aa-a089-4213-89d3-b56681e73e18] VM Stopped (Lifecycle Event)
Jan 21 18:25:04 compute-0 nova_compute[183278]: 2026-01-21 18:25:04.813 183284 DEBUG nova.compute.manager [None req-6de45a6d-b8bb-468e-81ef-2b4357f54d3f - - - - - -] [instance: 846c40aa-a089-4213-89d3-b56681e73e18] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:25:05 compute-0 podman[207886]: 2026-01-21 18:25:05.006880649 +0000 UTC m=+0.059008598 container health_status 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 21 18:25:05 compute-0 nova_compute[183278]: 2026-01-21 18:25:05.278 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:25:06 compute-0 nova_compute[183278]: 2026-01-21 18:25:06.553 183284 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769019891.5517116, 834c4219-2ee2-47cb-aa62-2ed8545ea4e6 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:25:06 compute-0 nova_compute[183278]: 2026-01-21 18:25:06.554 183284 INFO nova.compute.manager [-] [instance: 834c4219-2ee2-47cb-aa62-2ed8545ea4e6] VM Stopped (Lifecycle Event)
Jan 21 18:25:06 compute-0 nova_compute[183278]: 2026-01-21 18:25:06.591 183284 DEBUG nova.compute.manager [None req-6293a489-2f9f-455a-a352-2db556106422 - - - - - -] [instance: 834c4219-2ee2-47cb-aa62-2ed8545ea4e6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:25:06 compute-0 nova_compute[183278]: 2026-01-21 18:25:06.641 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:25:10 compute-0 nova_compute[183278]: 2026-01-21 18:25:10.280 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:25:11 compute-0 nova_compute[183278]: 2026-01-21 18:25:11.646 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:25:11 compute-0 nova_compute[183278]: 2026-01-21 18:25:11.821 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:25:11 compute-0 nova_compute[183278]: 2026-01-21 18:25:11.821 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 21 18:25:14 compute-0 nova_compute[183278]: 2026-01-21 18:25:14.829 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:25:14 compute-0 nova_compute[183278]: 2026-01-21 18:25:14.829 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 18:25:14 compute-0 nova_compute[183278]: 2026-01-21 18:25:14.829 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 18:25:14 compute-0 nova_compute[183278]: 2026-01-21 18:25:14.844 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 21 18:25:15 compute-0 nova_compute[183278]: 2026-01-21 18:25:15.282 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:25:15 compute-0 nova_compute[183278]: 2026-01-21 18:25:15.819 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:25:15 compute-0 nova_compute[183278]: 2026-01-21 18:25:15.845 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:25:15 compute-0 nova_compute[183278]: 2026-01-21 18:25:15.846 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:25:15 compute-0 nova_compute[183278]: 2026-01-21 18:25:15.846 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:25:15 compute-0 nova_compute[183278]: 2026-01-21 18:25:15.847 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 18:25:15 compute-0 nova_compute[183278]: 2026-01-21 18:25:15.978 183284 WARNING nova.virt.libvirt.driver [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 18:25:15 compute-0 nova_compute[183278]: 2026-01-21 18:25:15.979 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5869MB free_disk=73.38153839111328GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 18:25:15 compute-0 nova_compute[183278]: 2026-01-21 18:25:15.979 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:25:15 compute-0 nova_compute[183278]: 2026-01-21 18:25:15.980 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:25:16 compute-0 nova_compute[183278]: 2026-01-21 18:25:16.088 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 18:25:16 compute-0 nova_compute[183278]: 2026-01-21 18:25:16.088 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 18:25:16 compute-0 nova_compute[183278]: 2026-01-21 18:25:16.137 183284 DEBUG nova.compute.provider_tree [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Inventory has not changed in ProviderTree for provider: 502e4243-611b-433d-a766-9b485d51652d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 18:25:16 compute-0 nova_compute[183278]: 2026-01-21 18:25:16.153 183284 DEBUG nova.scheduler.client.report [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Inventory has not changed for provider 502e4243-611b-433d-a766-9b485d51652d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 18:25:16 compute-0 nova_compute[183278]: 2026-01-21 18:25:16.174 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 18:25:16 compute-0 nova_compute[183278]: 2026-01-21 18:25:16.175 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.195s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:25:16 compute-0 nova_compute[183278]: 2026-01-21 18:25:16.651 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:25:18 compute-0 nova_compute[183278]: 2026-01-21 18:25:18.173 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:25:19 compute-0 nova_compute[183278]: 2026-01-21 18:25:19.820 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:25:19 compute-0 nova_compute[183278]: 2026-01-21 18:25:19.821 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:25:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:25:20.082 104698 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:25:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:25:20.083 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:25:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:25:20.083 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:25:20 compute-0 nova_compute[183278]: 2026-01-21 18:25:20.285 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:25:20 compute-0 nova_compute[183278]: 2026-01-21 18:25:20.814 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:25:20 compute-0 nova_compute[183278]: 2026-01-21 18:25:20.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:25:20 compute-0 nova_compute[183278]: 2026-01-21 18:25:20.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:25:20 compute-0 nova_compute[183278]: 2026-01-21 18:25:20.817 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 18:25:21 compute-0 nova_compute[183278]: 2026-01-21 18:25:21.657 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:25:21 compute-0 nova_compute[183278]: 2026-01-21 18:25:21.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:25:21 compute-0 nova_compute[183278]: 2026-01-21 18:25:21.818 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:25:21 compute-0 nova_compute[183278]: 2026-01-21 18:25:21.818 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 21 18:25:21 compute-0 nova_compute[183278]: 2026-01-21 18:25:21.838 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 21 18:25:22 compute-0 ovn_controller[95419]: 2026-01-21T18:25:22Z|00106|memory_trim|INFO|Detected inactivity (last active 30007 ms ago): trimming memory
Jan 21 18:25:24 compute-0 podman[207914]: 2026-01-21 18:25:24.003764312 +0000 UTC m=+0.059623983 container health_status 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, name=ubi9-minimal, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, vcs-type=git, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, architecture=x86_64, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, release=1755695350)
Jan 21 18:25:24 compute-0 nova_compute[183278]: 2026-01-21 18:25:24.834 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:25:25 compute-0 nova_compute[183278]: 2026-01-21 18:25:25.288 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:25:26 compute-0 nova_compute[183278]: 2026-01-21 18:25:26.663 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:25:29 compute-0 podman[192560]: time="2026-01-21T18:25:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 18:25:29 compute-0 podman[192560]: @ - - [21/Jan/2026:18:25:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15354 "" "Go-http-client/1.1"
Jan 21 18:25:29 compute-0 podman[192560]: @ - - [21/Jan/2026:18:25:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2169 "" "Go-http-client/1.1"
Jan 21 18:25:30 compute-0 nova_compute[183278]: 2026-01-21 18:25:30.289 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:25:31 compute-0 podman[207935]: 2026-01-21 18:25:31.017450412 +0000 UTC m=+0.076735328 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 21 18:25:31 compute-0 podman[207936]: 2026-01-21 18:25:31.044384784 +0000 UTC m=+0.090920741 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 21 18:25:31 compute-0 openstack_network_exporter[195402]: ERROR   18:25:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 21 18:25:31 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:25:31 compute-0 openstack_network_exporter[195402]: ERROR   18:25:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 21 18:25:31 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:25:31 compute-0 nova_compute[183278]: 2026-01-21 18:25:31.665 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:25:32 compute-0 nova_compute[183278]: 2026-01-21 18:25:32.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:25:35 compute-0 nova_compute[183278]: 2026-01-21 18:25:35.291 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:25:36 compute-0 podman[207983]: 2026-01-21 18:25:36.017471304 +0000 UTC m=+0.078505230 container health_status 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 21 18:25:36 compute-0 nova_compute[183278]: 2026-01-21 18:25:36.668 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:25:40 compute-0 nova_compute[183278]: 2026-01-21 18:25:40.292 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:25:40 compute-0 nova_compute[183278]: 2026-01-21 18:25:40.719 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:25:41 compute-0 nova_compute[183278]: 2026-01-21 18:25:41.671 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:25:42 compute-0 nova_compute[183278]: 2026-01-21 18:25:42.158 183284 DEBUG oslo_concurrency.lockutils [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Acquiring lock "ddf15e49-2138-490c-b6b8-e70a211ff35c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:25:42 compute-0 nova_compute[183278]: 2026-01-21 18:25:42.159 183284 DEBUG oslo_concurrency.lockutils [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "ddf15e49-2138-490c-b6b8-e70a211ff35c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:25:42 compute-0 nova_compute[183278]: 2026-01-21 18:25:42.174 183284 DEBUG nova.compute.manager [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 21 18:25:42 compute-0 nova_compute[183278]: 2026-01-21 18:25:42.272 183284 DEBUG oslo_concurrency.lockutils [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:25:42 compute-0 nova_compute[183278]: 2026-01-21 18:25:42.272 183284 DEBUG oslo_concurrency.lockutils [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:25:42 compute-0 nova_compute[183278]: 2026-01-21 18:25:42.280 183284 DEBUG nova.virt.hardware [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 18:25:42 compute-0 nova_compute[183278]: 2026-01-21 18:25:42.281 183284 INFO nova.compute.claims [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Claim successful on node compute-0.ctlplane.example.com
Jan 21 18:25:42 compute-0 nova_compute[183278]: 2026-01-21 18:25:42.374 183284 DEBUG nova.compute.provider_tree [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Inventory has not changed in ProviderTree for provider: 502e4243-611b-433d-a766-9b485d51652d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 18:25:42 compute-0 nova_compute[183278]: 2026-01-21 18:25:42.398 183284 DEBUG nova.scheduler.client.report [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Inventory has not changed for provider 502e4243-611b-433d-a766-9b485d51652d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 18:25:42 compute-0 nova_compute[183278]: 2026-01-21 18:25:42.422 183284 DEBUG oslo_concurrency.lockutils [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.149s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:25:42 compute-0 nova_compute[183278]: 2026-01-21 18:25:42.423 183284 DEBUG nova.compute.manager [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 21 18:25:42 compute-0 nova_compute[183278]: 2026-01-21 18:25:42.479 183284 DEBUG nova.compute.manager [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 21 18:25:42 compute-0 nova_compute[183278]: 2026-01-21 18:25:42.479 183284 DEBUG nova.network.neutron [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 21 18:25:42 compute-0 nova_compute[183278]: 2026-01-21 18:25:42.501 183284 INFO nova.virt.libvirt.driver [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 21 18:25:42 compute-0 nova_compute[183278]: 2026-01-21 18:25:42.519 183284 DEBUG nova.compute.manager [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 21 18:25:42 compute-0 nova_compute[183278]: 2026-01-21 18:25:42.609 183284 DEBUG nova.compute.manager [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 21 18:25:42 compute-0 nova_compute[183278]: 2026-01-21 18:25:42.610 183284 DEBUG nova.virt.libvirt.driver [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 18:25:42 compute-0 nova_compute[183278]: 2026-01-21 18:25:42.611 183284 INFO nova.virt.libvirt.driver [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Creating image(s)
Jan 21 18:25:42 compute-0 nova_compute[183278]: 2026-01-21 18:25:42.611 183284 DEBUG oslo_concurrency.lockutils [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Acquiring lock "/var/lib/nova/instances/ddf15e49-2138-490c-b6b8-e70a211ff35c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:25:42 compute-0 nova_compute[183278]: 2026-01-21 18:25:42.611 183284 DEBUG oslo_concurrency.lockutils [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "/var/lib/nova/instances/ddf15e49-2138-490c-b6b8-e70a211ff35c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:25:42 compute-0 nova_compute[183278]: 2026-01-21 18:25:42.612 183284 DEBUG oslo_concurrency.lockutils [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "/var/lib/nova/instances/ddf15e49-2138-490c-b6b8-e70a211ff35c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:25:42 compute-0 nova_compute[183278]: 2026-01-21 18:25:42.625 183284 DEBUG oslo_concurrency.processutils [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:25:42 compute-0 nova_compute[183278]: 2026-01-21 18:25:42.685 183284 DEBUG oslo_concurrency.processutils [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:25:42 compute-0 nova_compute[183278]: 2026-01-21 18:25:42.687 183284 DEBUG oslo_concurrency.lockutils [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Acquiring lock "8bf45782a806e8f2f684ae874be9ab99d891a685" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:25:42 compute-0 nova_compute[183278]: 2026-01-21 18:25:42.688 183284 DEBUG oslo_concurrency.lockutils [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "8bf45782a806e8f2f684ae874be9ab99d891a685" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:25:42 compute-0 nova_compute[183278]: 2026-01-21 18:25:42.702 183284 DEBUG oslo_concurrency.processutils [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:25:42 compute-0 nova_compute[183278]: 2026-01-21 18:25:42.770 183284 DEBUG nova.policy [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '41dc6e790bc54fbfaf5c6007d3fa5f63', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'fe688847145f4dee992c72dd40bbc1ac', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 21 18:25:42 compute-0 nova_compute[183278]: 2026-01-21 18:25:42.773 183284 DEBUG oslo_concurrency.processutils [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:25:42 compute-0 nova_compute[183278]: 2026-01-21 18:25:42.774 183284 DEBUG oslo_concurrency.processutils [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685,backing_fmt=raw /var/lib/nova/instances/ddf15e49-2138-490c-b6b8-e70a211ff35c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:25:43 compute-0 nova_compute[183278]: 2026-01-21 18:25:43.072 183284 DEBUG oslo_concurrency.processutils [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685,backing_fmt=raw /var/lib/nova/instances/ddf15e49-2138-490c-b6b8-e70a211ff35c/disk 1073741824" returned: 0 in 0.298s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:25:43 compute-0 nova_compute[183278]: 2026-01-21 18:25:43.074 183284 DEBUG oslo_concurrency.lockutils [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "8bf45782a806e8f2f684ae874be9ab99d891a685" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.386s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:25:43 compute-0 nova_compute[183278]: 2026-01-21 18:25:43.075 183284 DEBUG oslo_concurrency.processutils [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:25:43 compute-0 nova_compute[183278]: 2026-01-21 18:25:43.134 183284 DEBUG oslo_concurrency.processutils [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:25:43 compute-0 nova_compute[183278]: 2026-01-21 18:25:43.135 183284 DEBUG nova.virt.disk.api [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Checking if we can resize image /var/lib/nova/instances/ddf15e49-2138-490c-b6b8-e70a211ff35c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 18:25:43 compute-0 nova_compute[183278]: 2026-01-21 18:25:43.136 183284 DEBUG oslo_concurrency.processutils [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ddf15e49-2138-490c-b6b8-e70a211ff35c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:25:43 compute-0 nova_compute[183278]: 2026-01-21 18:25:43.199 183284 DEBUG oslo_concurrency.processutils [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ddf15e49-2138-490c-b6b8-e70a211ff35c/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:25:43 compute-0 nova_compute[183278]: 2026-01-21 18:25:43.200 183284 DEBUG nova.virt.disk.api [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Cannot resize image /var/lib/nova/instances/ddf15e49-2138-490c-b6b8-e70a211ff35c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 18:25:43 compute-0 nova_compute[183278]: 2026-01-21 18:25:43.201 183284 DEBUG nova.objects.instance [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lazy-loading 'migration_context' on Instance uuid ddf15e49-2138-490c-b6b8-e70a211ff35c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:25:43 compute-0 nova_compute[183278]: 2026-01-21 18:25:43.214 183284 DEBUG nova.virt.libvirt.driver [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 18:25:43 compute-0 nova_compute[183278]: 2026-01-21 18:25:43.215 183284 DEBUG nova.virt.libvirt.driver [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Ensure instance console log exists: /var/lib/nova/instances/ddf15e49-2138-490c-b6b8-e70a211ff35c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 18:25:43 compute-0 nova_compute[183278]: 2026-01-21 18:25:43.215 183284 DEBUG oslo_concurrency.lockutils [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:25:43 compute-0 nova_compute[183278]: 2026-01-21 18:25:43.215 183284 DEBUG oslo_concurrency.lockutils [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:25:43 compute-0 nova_compute[183278]: 2026-01-21 18:25:43.216 183284 DEBUG oslo_concurrency.lockutils [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:25:43 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:25:43.260 104698 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e2:aa:66', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f6:39:87:73:c8'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 18:25:43 compute-0 nova_compute[183278]: 2026-01-21 18:25:43.260 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:25:43 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:25:43.261 104698 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 21 18:25:43 compute-0 nova_compute[183278]: 2026-01-21 18:25:43.382 183284 DEBUG nova.network.neutron [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Successfully created port: 5cb37f70-859d-4c72-ad8e-d7c03e18e588 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 21 18:25:44 compute-0 nova_compute[183278]: 2026-01-21 18:25:44.043 183284 DEBUG nova.network.neutron [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Successfully updated port: 5cb37f70-859d-4c72-ad8e-d7c03e18e588 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 21 18:25:44 compute-0 nova_compute[183278]: 2026-01-21 18:25:44.055 183284 DEBUG oslo_concurrency.lockutils [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Acquiring lock "refresh_cache-ddf15e49-2138-490c-b6b8-e70a211ff35c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:25:44 compute-0 nova_compute[183278]: 2026-01-21 18:25:44.056 183284 DEBUG oslo_concurrency.lockutils [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Acquired lock "refresh_cache-ddf15e49-2138-490c-b6b8-e70a211ff35c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 18:25:44 compute-0 nova_compute[183278]: 2026-01-21 18:25:44.056 183284 DEBUG nova.network.neutron [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 18:25:44 compute-0 nova_compute[183278]: 2026-01-21 18:25:44.146 183284 DEBUG nova.compute.manager [req-c4f03f6d-30b6-41b9-a00c-8c684524fc5a req-0d385335-d067-48b5-8577-881cdba15340 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Received event network-changed-5cb37f70-859d-4c72-ad8e-d7c03e18e588 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:25:44 compute-0 nova_compute[183278]: 2026-01-21 18:25:44.146 183284 DEBUG nova.compute.manager [req-c4f03f6d-30b6-41b9-a00c-8c684524fc5a req-0d385335-d067-48b5-8577-881cdba15340 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Refreshing instance network info cache due to event network-changed-5cb37f70-859d-4c72-ad8e-d7c03e18e588. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 18:25:44 compute-0 nova_compute[183278]: 2026-01-21 18:25:44.147 183284 DEBUG oslo_concurrency.lockutils [req-c4f03f6d-30b6-41b9-a00c-8c684524fc5a req-0d385335-d067-48b5-8577-881cdba15340 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "refresh_cache-ddf15e49-2138-490c-b6b8-e70a211ff35c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:25:44 compute-0 nova_compute[183278]: 2026-01-21 18:25:44.201 183284 DEBUG nova.network.neutron [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 18:25:44 compute-0 nova_compute[183278]: 2026-01-21 18:25:44.804 183284 DEBUG nova.network.neutron [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Updating instance_info_cache with network_info: [{"id": "5cb37f70-859d-4c72-ad8e-d7c03e18e588", "address": "fa:16:3e:83:66:98", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5cb37f70-85", "ovs_interfaceid": "5cb37f70-859d-4c72-ad8e-d7c03e18e588", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:25:44 compute-0 nova_compute[183278]: 2026-01-21 18:25:44.836 183284 DEBUG oslo_concurrency.lockutils [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Releasing lock "refresh_cache-ddf15e49-2138-490c-b6b8-e70a211ff35c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 18:25:44 compute-0 nova_compute[183278]: 2026-01-21 18:25:44.836 183284 DEBUG nova.compute.manager [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Instance network_info: |[{"id": "5cb37f70-859d-4c72-ad8e-d7c03e18e588", "address": "fa:16:3e:83:66:98", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5cb37f70-85", "ovs_interfaceid": "5cb37f70-859d-4c72-ad8e-d7c03e18e588", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 21 18:25:44 compute-0 nova_compute[183278]: 2026-01-21 18:25:44.837 183284 DEBUG oslo_concurrency.lockutils [req-c4f03f6d-30b6-41b9-a00c-8c684524fc5a req-0d385335-d067-48b5-8577-881cdba15340 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquired lock "refresh_cache-ddf15e49-2138-490c-b6b8-e70a211ff35c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 18:25:44 compute-0 nova_compute[183278]: 2026-01-21 18:25:44.837 183284 DEBUG nova.network.neutron [req-c4f03f6d-30b6-41b9-a00c-8c684524fc5a req-0d385335-d067-48b5-8577-881cdba15340 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Refreshing network info cache for port 5cb37f70-859d-4c72-ad8e-d7c03e18e588 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 18:25:44 compute-0 nova_compute[183278]: 2026-01-21 18:25:44.839 183284 DEBUG nova.virt.libvirt.driver [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Start _get_guest_xml network_info=[{"id": "5cb37f70-859d-4c72-ad8e-d7c03e18e588", "address": "fa:16:3e:83:66:98", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5cb37f70-85", "ovs_interfaceid": "5cb37f70-859d-4c72-ad8e-d7c03e18e588", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T18:09:31Z,direct_url=<?>,disk_format='qcow2',id=672306ae-5521-4fc1-a825-a16d6d125c61,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='05bd8d3790e24414a90ecf55ee1939e1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T18:09:33Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_type': 'disk', 'encryption_options': None, 'boot_index': 0, 'encryption_secret_uuid': None, 'encryption_format': None, 'image_id': '672306ae-5521-4fc1-a825-a16d6d125c61'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 18:25:44 compute-0 nova_compute[183278]: 2026-01-21 18:25:44.844 183284 WARNING nova.virt.libvirt.driver [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 18:25:44 compute-0 nova_compute[183278]: 2026-01-21 18:25:44.848 183284 DEBUG nova.virt.libvirt.host [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 18:25:44 compute-0 nova_compute[183278]: 2026-01-21 18:25:44.848 183284 DEBUG nova.virt.libvirt.host [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 18:25:44 compute-0 nova_compute[183278]: 2026-01-21 18:25:44.851 183284 DEBUG nova.virt.libvirt.host [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 18:25:44 compute-0 nova_compute[183278]: 2026-01-21 18:25:44.851 183284 DEBUG nova.virt.libvirt.host [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 18:25:44 compute-0 nova_compute[183278]: 2026-01-21 18:25:44.852 183284 DEBUG nova.virt.libvirt.driver [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 18:25:44 compute-0 nova_compute[183278]: 2026-01-21 18:25:44.852 183284 DEBUG nova.virt.hardware [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T18:09:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='45095fe9-3fd5-4f1f-87b2-a2a8292135a2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T18:09:31Z,direct_url=<?>,disk_format='qcow2',id=672306ae-5521-4fc1-a825-a16d6d125c61,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='05bd8d3790e24414a90ecf55ee1939e1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T18:09:33Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 18:25:44 compute-0 nova_compute[183278]: 2026-01-21 18:25:44.853 183284 DEBUG nova.virt.hardware [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 18:25:44 compute-0 nova_compute[183278]: 2026-01-21 18:25:44.853 183284 DEBUG nova.virt.hardware [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 18:25:44 compute-0 nova_compute[183278]: 2026-01-21 18:25:44.853 183284 DEBUG nova.virt.hardware [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 18:25:44 compute-0 nova_compute[183278]: 2026-01-21 18:25:44.853 183284 DEBUG nova.virt.hardware [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 18:25:44 compute-0 nova_compute[183278]: 2026-01-21 18:25:44.854 183284 DEBUG nova.virt.hardware [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 18:25:44 compute-0 nova_compute[183278]: 2026-01-21 18:25:44.854 183284 DEBUG nova.virt.hardware [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 18:25:44 compute-0 nova_compute[183278]: 2026-01-21 18:25:44.854 183284 DEBUG nova.virt.hardware [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 18:25:44 compute-0 nova_compute[183278]: 2026-01-21 18:25:44.854 183284 DEBUG nova.virt.hardware [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 18:25:44 compute-0 nova_compute[183278]: 2026-01-21 18:25:44.854 183284 DEBUG nova.virt.hardware [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 18:25:44 compute-0 nova_compute[183278]: 2026-01-21 18:25:44.855 183284 DEBUG nova.virt.hardware [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 18:25:44 compute-0 nova_compute[183278]: 2026-01-21 18:25:44.858 183284 DEBUG nova.virt.libvirt.vif [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T18:25:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1657890050',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1657890050',id=13,image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fe688847145f4dee992c72dd40bbc1ac',ramdisk_id='',reservation_id='r-tqcqa7vq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1753607426',owner_user_name='tempest-TestExecuteStrategies-1753607426-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T18:25:42Z,user_data=None,user_id='41dc6e790bc54fbfaf5c6007d3fa5f63',uuid=ddf15e49-2138-490c-b6b8-e70a211ff35c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5cb37f70-859d-4c72-ad8e-d7c03e18e588", "address": "fa:16:3e:83:66:98", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5cb37f70-85", "ovs_interfaceid": "5cb37f70-859d-4c72-ad8e-d7c03e18e588", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 18:25:44 compute-0 nova_compute[183278]: 2026-01-21 18:25:44.858 183284 DEBUG nova.network.os_vif_util [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Converting VIF {"id": "5cb37f70-859d-4c72-ad8e-d7c03e18e588", "address": "fa:16:3e:83:66:98", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5cb37f70-85", "ovs_interfaceid": "5cb37f70-859d-4c72-ad8e-d7c03e18e588", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 18:25:44 compute-0 nova_compute[183278]: 2026-01-21 18:25:44.859 183284 DEBUG nova.network.os_vif_util [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:83:66:98,bridge_name='br-int',has_traffic_filtering=True,id=5cb37f70-859d-4c72-ad8e-d7c03e18e588,network=Network(405ec01b-76d3-4c3c-a31b-5f16d9641fbf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5cb37f70-85') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 18:25:44 compute-0 nova_compute[183278]: 2026-01-21 18:25:44.859 183284 DEBUG nova.objects.instance [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lazy-loading 'pci_devices' on Instance uuid ddf15e49-2138-490c-b6b8-e70a211ff35c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:25:44 compute-0 nova_compute[183278]: 2026-01-21 18:25:44.872 183284 DEBUG nova.virt.libvirt.driver [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] End _get_guest_xml xml=<domain type="kvm">
Jan 21 18:25:44 compute-0 nova_compute[183278]:   <uuid>ddf15e49-2138-490c-b6b8-e70a211ff35c</uuid>
Jan 21 18:25:44 compute-0 nova_compute[183278]:   <name>instance-0000000d</name>
Jan 21 18:25:44 compute-0 nova_compute[183278]:   <memory>131072</memory>
Jan 21 18:25:44 compute-0 nova_compute[183278]:   <vcpu>1</vcpu>
Jan 21 18:25:44 compute-0 nova_compute[183278]:   <metadata>
Jan 21 18:25:44 compute-0 nova_compute[183278]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 18:25:44 compute-0 nova_compute[183278]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 18:25:44 compute-0 nova_compute[183278]:       <nova:name>tempest-TestExecuteStrategies-server-1657890050</nova:name>
Jan 21 18:25:44 compute-0 nova_compute[183278]:       <nova:creationTime>2026-01-21 18:25:44</nova:creationTime>
Jan 21 18:25:44 compute-0 nova_compute[183278]:       <nova:flavor name="m1.nano">
Jan 21 18:25:44 compute-0 nova_compute[183278]:         <nova:memory>128</nova:memory>
Jan 21 18:25:44 compute-0 nova_compute[183278]:         <nova:disk>1</nova:disk>
Jan 21 18:25:44 compute-0 nova_compute[183278]:         <nova:swap>0</nova:swap>
Jan 21 18:25:44 compute-0 nova_compute[183278]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 18:25:44 compute-0 nova_compute[183278]:         <nova:vcpus>1</nova:vcpus>
Jan 21 18:25:44 compute-0 nova_compute[183278]:       </nova:flavor>
Jan 21 18:25:44 compute-0 nova_compute[183278]:       <nova:owner>
Jan 21 18:25:44 compute-0 nova_compute[183278]:         <nova:user uuid="41dc6e790bc54fbfaf5c6007d3fa5f63">tempest-TestExecuteStrategies-1753607426-project-member</nova:user>
Jan 21 18:25:44 compute-0 nova_compute[183278]:         <nova:project uuid="fe688847145f4dee992c72dd40bbc1ac">tempest-TestExecuteStrategies-1753607426</nova:project>
Jan 21 18:25:44 compute-0 nova_compute[183278]:       </nova:owner>
Jan 21 18:25:44 compute-0 nova_compute[183278]:       <nova:root type="image" uuid="672306ae-5521-4fc1-a825-a16d6d125c61"/>
Jan 21 18:25:44 compute-0 nova_compute[183278]:       <nova:ports>
Jan 21 18:25:44 compute-0 nova_compute[183278]:         <nova:port uuid="5cb37f70-859d-4c72-ad8e-d7c03e18e588">
Jan 21 18:25:44 compute-0 nova_compute[183278]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 21 18:25:44 compute-0 nova_compute[183278]:         </nova:port>
Jan 21 18:25:44 compute-0 nova_compute[183278]:       </nova:ports>
Jan 21 18:25:44 compute-0 nova_compute[183278]:     </nova:instance>
Jan 21 18:25:44 compute-0 nova_compute[183278]:   </metadata>
Jan 21 18:25:44 compute-0 nova_compute[183278]:   <sysinfo type="smbios">
Jan 21 18:25:44 compute-0 nova_compute[183278]:     <system>
Jan 21 18:25:44 compute-0 nova_compute[183278]:       <entry name="manufacturer">RDO</entry>
Jan 21 18:25:44 compute-0 nova_compute[183278]:       <entry name="product">OpenStack Compute</entry>
Jan 21 18:25:44 compute-0 nova_compute[183278]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 18:25:44 compute-0 nova_compute[183278]:       <entry name="serial">ddf15e49-2138-490c-b6b8-e70a211ff35c</entry>
Jan 21 18:25:44 compute-0 nova_compute[183278]:       <entry name="uuid">ddf15e49-2138-490c-b6b8-e70a211ff35c</entry>
Jan 21 18:25:44 compute-0 nova_compute[183278]:       <entry name="family">Virtual Machine</entry>
Jan 21 18:25:44 compute-0 nova_compute[183278]:     </system>
Jan 21 18:25:44 compute-0 nova_compute[183278]:   </sysinfo>
Jan 21 18:25:44 compute-0 nova_compute[183278]:   <os>
Jan 21 18:25:44 compute-0 nova_compute[183278]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 18:25:44 compute-0 nova_compute[183278]:     <boot dev="hd"/>
Jan 21 18:25:44 compute-0 nova_compute[183278]:     <smbios mode="sysinfo"/>
Jan 21 18:25:44 compute-0 nova_compute[183278]:   </os>
Jan 21 18:25:44 compute-0 nova_compute[183278]:   <features>
Jan 21 18:25:44 compute-0 nova_compute[183278]:     <acpi/>
Jan 21 18:25:44 compute-0 nova_compute[183278]:     <apic/>
Jan 21 18:25:44 compute-0 nova_compute[183278]:     <vmcoreinfo/>
Jan 21 18:25:44 compute-0 nova_compute[183278]:   </features>
Jan 21 18:25:44 compute-0 nova_compute[183278]:   <clock offset="utc">
Jan 21 18:25:44 compute-0 nova_compute[183278]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 18:25:44 compute-0 nova_compute[183278]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 18:25:44 compute-0 nova_compute[183278]:     <timer name="hpet" present="no"/>
Jan 21 18:25:44 compute-0 nova_compute[183278]:   </clock>
Jan 21 18:25:44 compute-0 nova_compute[183278]:   <cpu mode="custom" match="exact">
Jan 21 18:25:44 compute-0 nova_compute[183278]:     <model>Nehalem</model>
Jan 21 18:25:44 compute-0 nova_compute[183278]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 18:25:44 compute-0 nova_compute[183278]:   </cpu>
Jan 21 18:25:44 compute-0 nova_compute[183278]:   <devices>
Jan 21 18:25:44 compute-0 nova_compute[183278]:     <disk type="file" device="disk">
Jan 21 18:25:44 compute-0 nova_compute[183278]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 18:25:44 compute-0 nova_compute[183278]:       <source file="/var/lib/nova/instances/ddf15e49-2138-490c-b6b8-e70a211ff35c/disk"/>
Jan 21 18:25:44 compute-0 nova_compute[183278]:       <target dev="vda" bus="virtio"/>
Jan 21 18:25:44 compute-0 nova_compute[183278]:     </disk>
Jan 21 18:25:44 compute-0 nova_compute[183278]:     <disk type="file" device="cdrom">
Jan 21 18:25:44 compute-0 nova_compute[183278]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 18:25:44 compute-0 nova_compute[183278]:       <source file="/var/lib/nova/instances/ddf15e49-2138-490c-b6b8-e70a211ff35c/disk.config"/>
Jan 21 18:25:44 compute-0 nova_compute[183278]:       <target dev="sda" bus="sata"/>
Jan 21 18:25:44 compute-0 nova_compute[183278]:     </disk>
Jan 21 18:25:44 compute-0 nova_compute[183278]:     <interface type="ethernet">
Jan 21 18:25:44 compute-0 nova_compute[183278]:       <mac address="fa:16:3e:83:66:98"/>
Jan 21 18:25:44 compute-0 nova_compute[183278]:       <model type="virtio"/>
Jan 21 18:25:44 compute-0 nova_compute[183278]:       <driver name="vhost" rx_queue_size="512"/>
Jan 21 18:25:44 compute-0 nova_compute[183278]:       <mtu size="1442"/>
Jan 21 18:25:44 compute-0 nova_compute[183278]:       <target dev="tap5cb37f70-85"/>
Jan 21 18:25:44 compute-0 nova_compute[183278]:     </interface>
Jan 21 18:25:44 compute-0 nova_compute[183278]:     <serial type="pty">
Jan 21 18:25:44 compute-0 nova_compute[183278]:       <log file="/var/lib/nova/instances/ddf15e49-2138-490c-b6b8-e70a211ff35c/console.log" append="off"/>
Jan 21 18:25:44 compute-0 nova_compute[183278]:     </serial>
Jan 21 18:25:44 compute-0 nova_compute[183278]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 18:25:44 compute-0 nova_compute[183278]:     <video>
Jan 21 18:25:44 compute-0 nova_compute[183278]:       <model type="virtio"/>
Jan 21 18:25:44 compute-0 nova_compute[183278]:     </video>
Jan 21 18:25:44 compute-0 nova_compute[183278]:     <input type="tablet" bus="usb"/>
Jan 21 18:25:44 compute-0 nova_compute[183278]:     <rng model="virtio">
Jan 21 18:25:44 compute-0 nova_compute[183278]:       <backend model="random">/dev/urandom</backend>
Jan 21 18:25:44 compute-0 nova_compute[183278]:     </rng>
Jan 21 18:25:44 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root"/>
Jan 21 18:25:44 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:25:44 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:25:44 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:25:44 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:25:44 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:25:44 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:25:44 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:25:44 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:25:44 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:25:44 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:25:44 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:25:44 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:25:44 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:25:44 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:25:44 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:25:44 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:25:44 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:25:44 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:25:44 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:25:44 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:25:44 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:25:44 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:25:44 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:25:44 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:25:44 compute-0 nova_compute[183278]:     <controller type="usb" index="0"/>
Jan 21 18:25:44 compute-0 nova_compute[183278]:     <memballoon model="virtio">
Jan 21 18:25:44 compute-0 nova_compute[183278]:       <stats period="10"/>
Jan 21 18:25:44 compute-0 nova_compute[183278]:     </memballoon>
Jan 21 18:25:44 compute-0 nova_compute[183278]:   </devices>
Jan 21 18:25:44 compute-0 nova_compute[183278]: </domain>
Jan 21 18:25:44 compute-0 nova_compute[183278]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 18:25:44 compute-0 nova_compute[183278]: 2026-01-21 18:25:44.874 183284 DEBUG nova.compute.manager [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Preparing to wait for external event network-vif-plugged-5cb37f70-859d-4c72-ad8e-d7c03e18e588 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 21 18:25:44 compute-0 nova_compute[183278]: 2026-01-21 18:25:44.874 183284 DEBUG oslo_concurrency.lockutils [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Acquiring lock "ddf15e49-2138-490c-b6b8-e70a211ff35c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:25:44 compute-0 nova_compute[183278]: 2026-01-21 18:25:44.874 183284 DEBUG oslo_concurrency.lockutils [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "ddf15e49-2138-490c-b6b8-e70a211ff35c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:25:44 compute-0 nova_compute[183278]: 2026-01-21 18:25:44.875 183284 DEBUG oslo_concurrency.lockutils [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "ddf15e49-2138-490c-b6b8-e70a211ff35c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:25:44 compute-0 nova_compute[183278]: 2026-01-21 18:25:44.875 183284 DEBUG nova.virt.libvirt.vif [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T18:25:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1657890050',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1657890050',id=13,image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fe688847145f4dee992c72dd40bbc1ac',ramdisk_id='',reservation_id='r-tqcqa7vq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1753607426',owner_user_name='tempest-TestExecuteStrategies-1753607426-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T18:25:42Z,user_data=None,user_id='41dc6e790bc54fbfaf5c6007d3fa5f63',uuid=ddf15e49-2138-490c-b6b8-e70a211ff35c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5cb37f70-859d-4c72-ad8e-d7c03e18e588", "address": "fa:16:3e:83:66:98", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5cb37f70-85", "ovs_interfaceid": "5cb37f70-859d-4c72-ad8e-d7c03e18e588", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 21 18:25:44 compute-0 nova_compute[183278]: 2026-01-21 18:25:44.876 183284 DEBUG nova.network.os_vif_util [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Converting VIF {"id": "5cb37f70-859d-4c72-ad8e-d7c03e18e588", "address": "fa:16:3e:83:66:98", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5cb37f70-85", "ovs_interfaceid": "5cb37f70-859d-4c72-ad8e-d7c03e18e588", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 18:25:44 compute-0 nova_compute[183278]: 2026-01-21 18:25:44.877 183284 DEBUG nova.network.os_vif_util [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:83:66:98,bridge_name='br-int',has_traffic_filtering=True,id=5cb37f70-859d-4c72-ad8e-d7c03e18e588,network=Network(405ec01b-76d3-4c3c-a31b-5f16d9641fbf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5cb37f70-85') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 18:25:44 compute-0 nova_compute[183278]: 2026-01-21 18:25:44.877 183284 DEBUG os_vif [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:83:66:98,bridge_name='br-int',has_traffic_filtering=True,id=5cb37f70-859d-4c72-ad8e-d7c03e18e588,network=Network(405ec01b-76d3-4c3c-a31b-5f16d9641fbf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5cb37f70-85') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 21 18:25:44 compute-0 nova_compute[183278]: 2026-01-21 18:25:44.878 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:25:44 compute-0 nova_compute[183278]: 2026-01-21 18:25:44.878 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:25:44 compute-0 nova_compute[183278]: 2026-01-21 18:25:44.878 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 18:25:44 compute-0 nova_compute[183278]: 2026-01-21 18:25:44.881 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:25:44 compute-0 nova_compute[183278]: 2026-01-21 18:25:44.882 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5cb37f70-85, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:25:44 compute-0 nova_compute[183278]: 2026-01-21 18:25:44.882 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5cb37f70-85, col_values=(('external_ids', {'iface-id': '5cb37f70-859d-4c72-ad8e-d7c03e18e588', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:83:66:98', 'vm-uuid': 'ddf15e49-2138-490c-b6b8-e70a211ff35c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:25:44 compute-0 nova_compute[183278]: 2026-01-21 18:25:44.884 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:25:44 compute-0 NetworkManager[55506]: <info>  [1769019944.8852] manager: (tap5cb37f70-85): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/50)
Jan 21 18:25:44 compute-0 nova_compute[183278]: 2026-01-21 18:25:44.886 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 18:25:44 compute-0 nova_compute[183278]: 2026-01-21 18:25:44.891 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:25:44 compute-0 nova_compute[183278]: 2026-01-21 18:25:44.892 183284 INFO os_vif [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:83:66:98,bridge_name='br-int',has_traffic_filtering=True,id=5cb37f70-859d-4c72-ad8e-d7c03e18e588,network=Network(405ec01b-76d3-4c3c-a31b-5f16d9641fbf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5cb37f70-85')
Jan 21 18:25:44 compute-0 nova_compute[183278]: 2026-01-21 18:25:44.947 183284 DEBUG nova.virt.libvirt.driver [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 18:25:44 compute-0 nova_compute[183278]: 2026-01-21 18:25:44.947 183284 DEBUG nova.virt.libvirt.driver [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 18:25:44 compute-0 nova_compute[183278]: 2026-01-21 18:25:44.948 183284 DEBUG nova.virt.libvirt.driver [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] No VIF found with MAC fa:16:3e:83:66:98, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 21 18:25:44 compute-0 nova_compute[183278]: 2026-01-21 18:25:44.948 183284 INFO nova.virt.libvirt.driver [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Using config drive
Jan 21 18:25:45 compute-0 nova_compute[183278]: 2026-01-21 18:25:45.309 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:25:47 compute-0 nova_compute[183278]: 2026-01-21 18:25:47.028 183284 INFO nova.virt.libvirt.driver [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Creating config drive at /var/lib/nova/instances/ddf15e49-2138-490c-b6b8-e70a211ff35c/disk.config
Jan 21 18:25:47 compute-0 nova_compute[183278]: 2026-01-21 18:25:47.033 183284 DEBUG oslo_concurrency.processutils [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ddf15e49-2138-490c-b6b8-e70a211ff35c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4el6q21y execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:25:47 compute-0 nova_compute[183278]: 2026-01-21 18:25:47.160 183284 DEBUG oslo_concurrency.processutils [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ddf15e49-2138-490c-b6b8-e70a211ff35c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4el6q21y" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:25:47 compute-0 kernel: tap5cb37f70-85: entered promiscuous mode
Jan 21 18:25:47 compute-0 NetworkManager[55506]: <info>  [1769019947.2208] manager: (tap5cb37f70-85): new Tun device (/org/freedesktop/NetworkManager/Devices/51)
Jan 21 18:25:47 compute-0 ovn_controller[95419]: 2026-01-21T18:25:47Z|00107|binding|INFO|Claiming lport 5cb37f70-859d-4c72-ad8e-d7c03e18e588 for this chassis.
Jan 21 18:25:47 compute-0 nova_compute[183278]: 2026-01-21 18:25:47.221 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:25:47 compute-0 ovn_controller[95419]: 2026-01-21T18:25:47Z|00108|binding|INFO|5cb37f70-859d-4c72-ad8e-d7c03e18e588: Claiming fa:16:3e:83:66:98 10.100.0.10
Jan 21 18:25:47 compute-0 ovn_controller[95419]: 2026-01-21T18:25:47Z|00109|binding|INFO|Setting lport 5cb37f70-859d-4c72-ad8e-d7c03e18e588 ovn-installed in OVS
Jan 21 18:25:47 compute-0 ovn_controller[95419]: 2026-01-21T18:25:47Z|00110|binding|INFO|Setting lport 5cb37f70-859d-4c72-ad8e-d7c03e18e588 up in Southbound
Jan 21 18:25:47 compute-0 nova_compute[183278]: 2026-01-21 18:25:47.235 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:25:47 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:25:47.236 104698 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:83:66:98 10.100.0.10'], port_security=['fa:16:3e:83:66:98 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'ddf15e49-2138-490c-b6b8-e70a211ff35c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-405ec01b-76d3-4c3c-a31b-5f16d9641fbf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe688847145f4dee992c72dd40bbc1ac', 'neutron:revision_number': '2', 'neutron:security_group_ids': '772bc98e-4f63-476c-ac15-1e185ee339f7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=64b67886-c0d9-40d2-a2d0-cf96a9cd3c14, chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>], logical_port=5cb37f70-859d-4c72-ad8e-d7c03e18e588) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 18:25:47 compute-0 nova_compute[183278]: 2026-01-21 18:25:47.237 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:25:47 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:25:47.237 104698 INFO neutron.agent.ovn.metadata.agent [-] Port 5cb37f70-859d-4c72-ad8e-d7c03e18e588 in datapath 405ec01b-76d3-4c3c-a31b-5f16d9641fbf bound to our chassis
Jan 21 18:25:47 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:25:47.239 104698 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 405ec01b-76d3-4c3c-a31b-5f16d9641fbf
Jan 21 18:25:47 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:25:47.249 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[f2d48299-4015-432a-ab0a-8cd20f7e1972]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:25:47 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:25:47.250 104698 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap405ec01b-71 in ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 21 18:25:47 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:25:47.251 203892 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap405ec01b-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 21 18:25:47 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:25:47.251 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[35973af2-7c93-4b8c-8698-d3ef149ea5e0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:25:47 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:25:47.252 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[11300f46-7535-438a-863b-420c96584389]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:25:47 compute-0 systemd-udevd[208042]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 18:25:47 compute-0 systemd-machined[154592]: New machine qemu-10-instance-0000000d.
Jan 21 18:25:47 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:25:47.263 105036 DEBUG oslo.privsep.daemon [-] privsep: reply[80f5e795-6391-4be7-a3db-a5b90044169e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:25:47 compute-0 NetworkManager[55506]: <info>  [1769019947.2683] device (tap5cb37f70-85): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 18:25:47 compute-0 NetworkManager[55506]: <info>  [1769019947.2690] device (tap5cb37f70-85): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 18:25:47 compute-0 systemd[1]: Started Virtual Machine qemu-10-instance-0000000d.
Jan 21 18:25:47 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:25:47.275 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[29fb4d46-06a3-4849-ba45-d52baf4f906e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:25:47 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:25:47.300 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[eec3d204-21c3-4664-aefa-f4c6e9dadfd8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:25:47 compute-0 systemd-udevd[208047]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 18:25:47 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:25:47.305 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[81433d0e-9565-4faa-b36b-01ea95638df6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:25:47 compute-0 NetworkManager[55506]: <info>  [1769019947.3075] manager: (tap405ec01b-70): new Veth device (/org/freedesktop/NetworkManager/Devices/52)
Jan 21 18:25:47 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:25:47.337 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[9b47d639-f4e8-4092-a7ec-b025612a860b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:25:47 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:25:47.339 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[af99f671-682c-43a9-80b9-031820f1cb2a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:25:47 compute-0 NetworkManager[55506]: <info>  [1769019947.3607] device (tap405ec01b-70): carrier: link connected
Jan 21 18:25:47 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:25:47.367 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[d1fe69a7-a177-40a4-877a-5425632b4cc1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:25:47 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:25:47.383 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[6536204d-d060-42a2-aa6a-2db82b293e66]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap405ec01b-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6c:95:02'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 35], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 451096, 'reachable_time': 27006, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 208075, 'error': None, 'target': 'ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:25:47 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:25:47.399 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[bee39397-d317-41df-ad02-38d0195f92a8]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe6c:9502'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 451096, 'tstamp': 451096}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 208076, 'error': None, 'target': 'ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:25:47 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:25:47.416 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[29f72e25-8a74-4e25-b6b4-58339e07e9ec]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap405ec01b-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6c:95:02'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 35], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 451096, 'reachable_time': 27006, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 208078, 'error': None, 'target': 'ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:25:47 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:25:47.450 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[0ae9f416-3172-448f-aefb-225bf54116aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:25:47 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:25:47.509 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[41a017cf-bf69-46f6-aecb-b0db419f4416]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:25:47 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:25:47.511 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap405ec01b-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:25:47 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:25:47.511 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 18:25:47 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:25:47.511 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap405ec01b-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:25:47 compute-0 NetworkManager[55506]: <info>  [1769019947.5140] manager: (tap405ec01b-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/53)
Jan 21 18:25:47 compute-0 kernel: tap405ec01b-70: entered promiscuous mode
Jan 21 18:25:47 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:25:47.517 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap405ec01b-70, col_values=(('external_ids', {'iface-id': '9c897ad2-8ce5-4903-8c83-1ed8f117dcdd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:25:47 compute-0 nova_compute[183278]: 2026-01-21 18:25:47.518 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:25:47 compute-0 ovn_controller[95419]: 2026-01-21T18:25:47Z|00111|binding|INFO|Releasing lport 9c897ad2-8ce5-4903-8c83-1ed8f117dcdd from this chassis (sb_readonly=0)
Jan 21 18:25:47 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:25:47.519 104698 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/405ec01b-76d3-4c3c-a31b-5f16d9641fbf.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/405ec01b-76d3-4c3c-a31b-5f16d9641fbf.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 21 18:25:47 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:25:47.520 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[80ed048e-7563-4d07-bda0-0bd96c39933b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:25:47 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:25:47.520 104698 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 18:25:47 compute-0 ovn_metadata_agent[104693]: global
Jan 21 18:25:47 compute-0 ovn_metadata_agent[104693]:     log         /dev/log local0 debug
Jan 21 18:25:47 compute-0 ovn_metadata_agent[104693]:     log-tag     haproxy-metadata-proxy-405ec01b-76d3-4c3c-a31b-5f16d9641fbf
Jan 21 18:25:47 compute-0 ovn_metadata_agent[104693]:     user        root
Jan 21 18:25:47 compute-0 ovn_metadata_agent[104693]:     group       root
Jan 21 18:25:47 compute-0 ovn_metadata_agent[104693]:     maxconn     1024
Jan 21 18:25:47 compute-0 ovn_metadata_agent[104693]:     pidfile     /var/lib/neutron/external/pids/405ec01b-76d3-4c3c-a31b-5f16d9641fbf.pid.haproxy
Jan 21 18:25:47 compute-0 ovn_metadata_agent[104693]:     daemon
Jan 21 18:25:47 compute-0 ovn_metadata_agent[104693]: 
Jan 21 18:25:47 compute-0 ovn_metadata_agent[104693]: defaults
Jan 21 18:25:47 compute-0 ovn_metadata_agent[104693]:     log global
Jan 21 18:25:47 compute-0 ovn_metadata_agent[104693]:     mode http
Jan 21 18:25:47 compute-0 ovn_metadata_agent[104693]:     option httplog
Jan 21 18:25:47 compute-0 ovn_metadata_agent[104693]:     option dontlognull
Jan 21 18:25:47 compute-0 ovn_metadata_agent[104693]:     option http-server-close
Jan 21 18:25:47 compute-0 ovn_metadata_agent[104693]:     option forwardfor
Jan 21 18:25:47 compute-0 ovn_metadata_agent[104693]:     retries                 3
Jan 21 18:25:47 compute-0 ovn_metadata_agent[104693]:     timeout http-request    30s
Jan 21 18:25:47 compute-0 ovn_metadata_agent[104693]:     timeout connect         30s
Jan 21 18:25:47 compute-0 ovn_metadata_agent[104693]:     timeout client          32s
Jan 21 18:25:47 compute-0 ovn_metadata_agent[104693]:     timeout server          32s
Jan 21 18:25:47 compute-0 ovn_metadata_agent[104693]:     timeout http-keep-alive 30s
Jan 21 18:25:47 compute-0 ovn_metadata_agent[104693]: 
Jan 21 18:25:47 compute-0 ovn_metadata_agent[104693]: 
Jan 21 18:25:47 compute-0 ovn_metadata_agent[104693]: listen listener
Jan 21 18:25:47 compute-0 ovn_metadata_agent[104693]:     bind 169.254.169.254:80
Jan 21 18:25:47 compute-0 ovn_metadata_agent[104693]:     server metadata /var/lib/neutron/metadata_proxy
Jan 21 18:25:47 compute-0 ovn_metadata_agent[104693]:     http-request add-header X-OVN-Network-ID 405ec01b-76d3-4c3c-a31b-5f16d9641fbf
Jan 21 18:25:47 compute-0 ovn_metadata_agent[104693]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 21 18:25:47 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:25:47.521 104698 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf', 'env', 'PROCESS_TAG=haproxy-405ec01b-76d3-4c3c-a31b-5f16d9641fbf', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/405ec01b-76d3-4c3c-a31b-5f16d9641fbf.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 21 18:25:47 compute-0 nova_compute[183278]: 2026-01-21 18:25:47.645 183284 DEBUG nova.compute.manager [req-9559ae30-f144-46a0-b9f4-c28e3478ae74 req-cdd4a16e-1829-4937-9e00-a9189f2188bd 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Received event network-vif-plugged-5cb37f70-859d-4c72-ad8e-d7c03e18e588 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:25:47 compute-0 nova_compute[183278]: 2026-01-21 18:25:47.646 183284 DEBUG oslo_concurrency.lockutils [req-9559ae30-f144-46a0-b9f4-c28e3478ae74 req-cdd4a16e-1829-4937-9e00-a9189f2188bd 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "ddf15e49-2138-490c-b6b8-e70a211ff35c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:25:47 compute-0 nova_compute[183278]: 2026-01-21 18:25:47.646 183284 DEBUG oslo_concurrency.lockutils [req-9559ae30-f144-46a0-b9f4-c28e3478ae74 req-cdd4a16e-1829-4937-9e00-a9189f2188bd 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "ddf15e49-2138-490c-b6b8-e70a211ff35c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:25:47 compute-0 nova_compute[183278]: 2026-01-21 18:25:47.646 183284 DEBUG oslo_concurrency.lockutils [req-9559ae30-f144-46a0-b9f4-c28e3478ae74 req-cdd4a16e-1829-4937-9e00-a9189f2188bd 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "ddf15e49-2138-490c-b6b8-e70a211ff35c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:25:47 compute-0 nova_compute[183278]: 2026-01-21 18:25:47.647 183284 DEBUG nova.compute.manager [req-9559ae30-f144-46a0-b9f4-c28e3478ae74 req-cdd4a16e-1829-4937-9e00-a9189f2188bd 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Processing event network-vif-plugged-5cb37f70-859d-4c72-ad8e-d7c03e18e588 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 21 18:25:47 compute-0 nova_compute[183278]: 2026-01-21 18:25:47.738 183284 DEBUG nova.virt.driver [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Emitting event <LifecycleEvent: 1769019947.738049, ddf15e49-2138-490c-b6b8-e70a211ff35c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:25:47 compute-0 nova_compute[183278]: 2026-01-21 18:25:47.739 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] VM Started (Lifecycle Event)
Jan 21 18:25:47 compute-0 nova_compute[183278]: 2026-01-21 18:25:47.741 183284 DEBUG nova.compute.manager [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 18:25:47 compute-0 nova_compute[183278]: 2026-01-21 18:25:47.744 183284 DEBUG nova.virt.libvirt.driver [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 18:25:47 compute-0 nova_compute[183278]: 2026-01-21 18:25:47.747 183284 INFO nova.virt.libvirt.driver [-] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Instance spawned successfully.
Jan 21 18:25:47 compute-0 nova_compute[183278]: 2026-01-21 18:25:47.747 183284 DEBUG nova.virt.libvirt.driver [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 21 18:25:47 compute-0 nova_compute[183278]: 2026-01-21 18:25:47.766 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:25:47 compute-0 nova_compute[183278]: 2026-01-21 18:25:47.772 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 18:25:47 compute-0 nova_compute[183278]: 2026-01-21 18:25:47.775 183284 DEBUG nova.virt.libvirt.driver [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:25:47 compute-0 nova_compute[183278]: 2026-01-21 18:25:47.776 183284 DEBUG nova.virt.libvirt.driver [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:25:47 compute-0 nova_compute[183278]: 2026-01-21 18:25:47.776 183284 DEBUG nova.virt.libvirt.driver [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:25:47 compute-0 nova_compute[183278]: 2026-01-21 18:25:47.777 183284 DEBUG nova.virt.libvirt.driver [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:25:47 compute-0 nova_compute[183278]: 2026-01-21 18:25:47.777 183284 DEBUG nova.virt.libvirt.driver [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:25:47 compute-0 nova_compute[183278]: 2026-01-21 18:25:47.777 183284 DEBUG nova.virt.libvirt.driver [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:25:47 compute-0 nova_compute[183278]: 2026-01-21 18:25:47.801 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 18:25:47 compute-0 nova_compute[183278]: 2026-01-21 18:25:47.803 183284 DEBUG nova.virt.driver [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Emitting event <LifecycleEvent: 1769019947.7390082, ddf15e49-2138-490c-b6b8-e70a211ff35c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:25:47 compute-0 nova_compute[183278]: 2026-01-21 18:25:47.803 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] VM Paused (Lifecycle Event)
Jan 21 18:25:47 compute-0 nova_compute[183278]: 2026-01-21 18:25:47.825 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:25:47 compute-0 nova_compute[183278]: 2026-01-21 18:25:47.831 183284 DEBUG nova.virt.driver [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Emitting event <LifecycleEvent: 1769019947.7435415, ddf15e49-2138-490c-b6b8-e70a211ff35c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:25:47 compute-0 nova_compute[183278]: 2026-01-21 18:25:47.831 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] VM Resumed (Lifecycle Event)
Jan 21 18:25:47 compute-0 nova_compute[183278]: 2026-01-21 18:25:47.856 183284 DEBUG nova.network.neutron [req-c4f03f6d-30b6-41b9-a00c-8c684524fc5a req-0d385335-d067-48b5-8577-881cdba15340 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Updated VIF entry in instance network info cache for port 5cb37f70-859d-4c72-ad8e-d7c03e18e588. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 18:25:47 compute-0 nova_compute[183278]: 2026-01-21 18:25:47.856 183284 DEBUG nova.network.neutron [req-c4f03f6d-30b6-41b9-a00c-8c684524fc5a req-0d385335-d067-48b5-8577-881cdba15340 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Updating instance_info_cache with network_info: [{"id": "5cb37f70-859d-4c72-ad8e-d7c03e18e588", "address": "fa:16:3e:83:66:98", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5cb37f70-85", "ovs_interfaceid": "5cb37f70-859d-4c72-ad8e-d7c03e18e588", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:25:47 compute-0 nova_compute[183278]: 2026-01-21 18:25:47.860 183284 INFO nova.compute.manager [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Took 5.25 seconds to spawn the instance on the hypervisor.
Jan 21 18:25:47 compute-0 nova_compute[183278]: 2026-01-21 18:25:47.861 183284 DEBUG nova.compute.manager [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:25:47 compute-0 nova_compute[183278]: 2026-01-21 18:25:47.862 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:25:47 compute-0 nova_compute[183278]: 2026-01-21 18:25:47.867 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 18:25:47 compute-0 nova_compute[183278]: 2026-01-21 18:25:47.887 183284 DEBUG oslo_concurrency.lockutils [req-c4f03f6d-30b6-41b9-a00c-8c684524fc5a req-0d385335-d067-48b5-8577-881cdba15340 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Releasing lock "refresh_cache-ddf15e49-2138-490c-b6b8-e70a211ff35c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 18:25:47 compute-0 podman[208117]: 2026-01-21 18:25:47.897897181 +0000 UTC m=+0.063075267 container create 8088761110efd0583ede328b84701b1c2d9a9122282d6771bfb70aade393bf99 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 21 18:25:47 compute-0 nova_compute[183278]: 2026-01-21 18:25:47.899 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 18:25:47 compute-0 nova_compute[183278]: 2026-01-21 18:25:47.927 183284 INFO nova.compute.manager [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Took 5.69 seconds to build instance.
Jan 21 18:25:47 compute-0 systemd[1]: Started libpod-conmon-8088761110efd0583ede328b84701b1c2d9a9122282d6771bfb70aade393bf99.scope.
Jan 21 18:25:47 compute-0 nova_compute[183278]: 2026-01-21 18:25:47.943 183284 DEBUG oslo_concurrency.lockutils [None req-a673cfea-756f-47e6-bb1b-f71ea8fb47f7 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "ddf15e49-2138-490c-b6b8-e70a211ff35c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.784s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:25:47 compute-0 systemd[1]: Started libcrun container.
Jan 21 18:25:47 compute-0 podman[208117]: 2026-01-21 18:25:47.866174303 +0000 UTC m=+0.031352409 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 18:25:47 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c8433fb588c03a69f19e5575dab6caf40ec2452c1ad4068f154dcd575c80c031/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 18:25:47 compute-0 podman[208117]: 2026-01-21 18:25:47.977208299 +0000 UTC m=+0.142386385 container init 8088761110efd0583ede328b84701b1c2d9a9122282d6771bfb70aade393bf99 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 21 18:25:47 compute-0 podman[208117]: 2026-01-21 18:25:47.982266192 +0000 UTC m=+0.147444288 container start 8088761110efd0583ede328b84701b1c2d9a9122282d6771bfb70aade393bf99 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 21 18:25:48 compute-0 neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf[208132]: [NOTICE]   (208136) : New worker (208138) forked
Jan 21 18:25:48 compute-0 neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf[208132]: [NOTICE]   (208136) : Loading success.
Jan 21 18:25:49 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:25:49.263 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a4db021d-a451-4e5f-8011-49af760bda68, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:25:49 compute-0 nova_compute[183278]: 2026-01-21 18:25:49.746 183284 DEBUG nova.compute.manager [req-71917284-f2e9-4a54-b10c-b7d08c88340f req-66212328-be5d-4599-8950-9cc0ae1f16e1 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Received event network-vif-plugged-5cb37f70-859d-4c72-ad8e-d7c03e18e588 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:25:49 compute-0 nova_compute[183278]: 2026-01-21 18:25:49.747 183284 DEBUG oslo_concurrency.lockutils [req-71917284-f2e9-4a54-b10c-b7d08c88340f req-66212328-be5d-4599-8950-9cc0ae1f16e1 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "ddf15e49-2138-490c-b6b8-e70a211ff35c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:25:49 compute-0 nova_compute[183278]: 2026-01-21 18:25:49.747 183284 DEBUG oslo_concurrency.lockutils [req-71917284-f2e9-4a54-b10c-b7d08c88340f req-66212328-be5d-4599-8950-9cc0ae1f16e1 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "ddf15e49-2138-490c-b6b8-e70a211ff35c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:25:49 compute-0 nova_compute[183278]: 2026-01-21 18:25:49.747 183284 DEBUG oslo_concurrency.lockutils [req-71917284-f2e9-4a54-b10c-b7d08c88340f req-66212328-be5d-4599-8950-9cc0ae1f16e1 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "ddf15e49-2138-490c-b6b8-e70a211ff35c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:25:49 compute-0 nova_compute[183278]: 2026-01-21 18:25:49.747 183284 DEBUG nova.compute.manager [req-71917284-f2e9-4a54-b10c-b7d08c88340f req-66212328-be5d-4599-8950-9cc0ae1f16e1 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] No waiting events found dispatching network-vif-plugged-5cb37f70-859d-4c72-ad8e-d7c03e18e588 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:25:49 compute-0 nova_compute[183278]: 2026-01-21 18:25:49.748 183284 WARNING nova.compute.manager [req-71917284-f2e9-4a54-b10c-b7d08c88340f req-66212328-be5d-4599-8950-9cc0ae1f16e1 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Received unexpected event network-vif-plugged-5cb37f70-859d-4c72-ad8e-d7c03e18e588 for instance with vm_state active and task_state None.
Jan 21 18:25:49 compute-0 nova_compute[183278]: 2026-01-21 18:25:49.886 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:25:50 compute-0 nova_compute[183278]: 2026-01-21 18:25:50.315 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:25:54 compute-0 nova_compute[183278]: 2026-01-21 18:25:54.904 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:25:55 compute-0 podman[208147]: 2026-01-21 18:25:55.020394504 +0000 UTC m=+0.076094633 container health_status 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, io.openshift.tags=minimal rhel9, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, config_id=openstack_network_exporter)
Jan 21 18:25:55 compute-0 nova_compute[183278]: 2026-01-21 18:25:55.317 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:25:59 compute-0 podman[192560]: time="2026-01-21T18:25:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 18:25:59 compute-0 podman[192560]: @ - - [21/Jan/2026:18:25:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16587 "" "Go-http-client/1.1"
Jan 21 18:25:59 compute-0 podman[192560]: @ - - [21/Jan/2026:18:25:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2636 "" "Go-http-client/1.1"
Jan 21 18:25:59 compute-0 nova_compute[183278]: 2026-01-21 18:25:59.907 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:26:00 compute-0 nova_compute[183278]: 2026-01-21 18:26:00.320 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:26:00 compute-0 ovn_controller[95419]: 2026-01-21T18:26:00Z|00018|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:83:66:98 10.100.0.10
Jan 21 18:26:00 compute-0 ovn_controller[95419]: 2026-01-21T18:26:00Z|00019|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:83:66:98 10.100.0.10
Jan 21 18:26:01 compute-0 openstack_network_exporter[195402]: ERROR   18:26:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 21 18:26:01 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:26:01 compute-0 openstack_network_exporter[195402]: ERROR   18:26:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 21 18:26:01 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:26:01 compute-0 anacron[143712]: Job `cron.weekly' started
Jan 21 18:26:01 compute-0 anacron[143712]: Job `cron.weekly' terminated
Jan 21 18:26:02 compute-0 podman[208186]: 2026-01-21 18:26:02.013474584 +0000 UTC m=+0.064876590 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 21 18:26:02 compute-0 podman[208185]: 2026-01-21 18:26:02.046583656 +0000 UTC m=+0.099354525 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 21 18:26:04 compute-0 nova_compute[183278]: 2026-01-21 18:26:04.911 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:26:05 compute-0 nova_compute[183278]: 2026-01-21 18:26:05.322 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:26:06 compute-0 podman[208228]: 2026-01-21 18:26:06.989414923 +0000 UTC m=+0.049924708 container health_status 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 21 18:26:09 compute-0 nova_compute[183278]: 2026-01-21 18:26:09.914 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:26:10 compute-0 nova_compute[183278]: 2026-01-21 18:26:10.324 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:26:14 compute-0 nova_compute[183278]: 2026-01-21 18:26:14.916 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:26:15 compute-0 nova_compute[183278]: 2026-01-21 18:26:15.326 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:26:17 compute-0 nova_compute[183278]: 2026-01-21 18:26:17.009 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:26:17 compute-0 nova_compute[183278]: 2026-01-21 18:26:17.009 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 18:26:17 compute-0 nova_compute[183278]: 2026-01-21 18:26:17.009 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 18:26:17 compute-0 nova_compute[183278]: 2026-01-21 18:26:17.920 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "refresh_cache-ddf15e49-2138-490c-b6b8-e70a211ff35c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:26:17 compute-0 nova_compute[183278]: 2026-01-21 18:26:17.921 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquired lock "refresh_cache-ddf15e49-2138-490c-b6b8-e70a211ff35c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 18:26:17 compute-0 nova_compute[183278]: 2026-01-21 18:26:17.921 183284 DEBUG nova.network.neutron [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 21 18:26:17 compute-0 nova_compute[183278]: 2026-01-21 18:26:17.921 183284 DEBUG nova.objects.instance [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lazy-loading 'info_cache' on Instance uuid ddf15e49-2138-490c-b6b8-e70a211ff35c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:26:19 compute-0 nova_compute[183278]: 2026-01-21 18:26:19.920 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:26:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:26:20.082 104698 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:26:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:26:20.083 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:26:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:26:20.083 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:26:20 compute-0 nova_compute[183278]: 2026-01-21 18:26:20.328 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:26:20 compute-0 nova_compute[183278]: 2026-01-21 18:26:20.467 183284 DEBUG nova.network.neutron [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Updating instance_info_cache with network_info: [{"id": "5cb37f70-859d-4c72-ad8e-d7c03e18e588", "address": "fa:16:3e:83:66:98", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5cb37f70-85", "ovs_interfaceid": "5cb37f70-859d-4c72-ad8e-d7c03e18e588", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:26:20 compute-0 nova_compute[183278]: 2026-01-21 18:26:20.488 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Releasing lock "refresh_cache-ddf15e49-2138-490c-b6b8-e70a211ff35c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 18:26:20 compute-0 nova_compute[183278]: 2026-01-21 18:26:20.489 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 21 18:26:20 compute-0 nova_compute[183278]: 2026-01-21 18:26:20.490 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:26:20 compute-0 nova_compute[183278]: 2026-01-21 18:26:20.490 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:26:20 compute-0 nova_compute[183278]: 2026-01-21 18:26:20.491 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:26:20 compute-0 nova_compute[183278]: 2026-01-21 18:26:20.517 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:26:20 compute-0 nova_compute[183278]: 2026-01-21 18:26:20.518 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:26:20 compute-0 nova_compute[183278]: 2026-01-21 18:26:20.518 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:26:20 compute-0 nova_compute[183278]: 2026-01-21 18:26:20.519 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 18:26:20 compute-0 nova_compute[183278]: 2026-01-21 18:26:20.577 183284 DEBUG oslo_concurrency.processutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ddf15e49-2138-490c-b6b8-e70a211ff35c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:26:20 compute-0 nova_compute[183278]: 2026-01-21 18:26:20.637 183284 DEBUG oslo_concurrency.processutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ddf15e49-2138-490c-b6b8-e70a211ff35c/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:26:20 compute-0 nova_compute[183278]: 2026-01-21 18:26:20.638 183284 DEBUG oslo_concurrency.processutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ddf15e49-2138-490c-b6b8-e70a211ff35c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:26:20 compute-0 nova_compute[183278]: 2026-01-21 18:26:20.693 183284 DEBUG oslo_concurrency.processutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ddf15e49-2138-490c-b6b8-e70a211ff35c/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:26:20 compute-0 nova_compute[183278]: 2026-01-21 18:26:20.849 183284 WARNING nova.virt.libvirt.driver [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 18:26:20 compute-0 nova_compute[183278]: 2026-01-21 18:26:20.850 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5696MB free_disk=73.3521957397461GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 18:26:20 compute-0 nova_compute[183278]: 2026-01-21 18:26:20.850 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:26:20 compute-0 nova_compute[183278]: 2026-01-21 18:26:20.850 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:26:20 compute-0 nova_compute[183278]: 2026-01-21 18:26:20.912 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Instance ddf15e49-2138-490c-b6b8-e70a211ff35c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 21 18:26:20 compute-0 nova_compute[183278]: 2026-01-21 18:26:20.912 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 18:26:20 compute-0 nova_compute[183278]: 2026-01-21 18:26:20.912 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 18:26:20 compute-0 nova_compute[183278]: 2026-01-21 18:26:20.984 183284 DEBUG nova.compute.provider_tree [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Inventory has not changed in ProviderTree for provider: 502e4243-611b-433d-a766-9b485d51652d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 18:26:20 compute-0 nova_compute[183278]: 2026-01-21 18:26:20.997 183284 DEBUG nova.scheduler.client.report [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Inventory has not changed for provider 502e4243-611b-433d-a766-9b485d51652d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 18:26:21 compute-0 nova_compute[183278]: 2026-01-21 18:26:21.013 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 18:26:21 compute-0 nova_compute[183278]: 2026-01-21 18:26:21.013 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.163s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:26:22 compute-0 nova_compute[183278]: 2026-01-21 18:26:22.339 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:26:22 compute-0 nova_compute[183278]: 2026-01-21 18:26:22.339 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:26:22 compute-0 nova_compute[183278]: 2026-01-21 18:26:22.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:26:22 compute-0 nova_compute[183278]: 2026-01-21 18:26:22.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:26:22 compute-0 nova_compute[183278]: 2026-01-21 18:26:22.817 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 18:26:23 compute-0 nova_compute[183278]: 2026-01-21 18:26:23.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:26:24 compute-0 nova_compute[183278]: 2026-01-21 18:26:24.923 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:26:25 compute-0 nova_compute[183278]: 2026-01-21 18:26:25.330 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:26:25 compute-0 podman[208259]: 2026-01-21 18:26:25.998731668 +0000 UTC m=+0.057865011 container health_status 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, architecture=x86_64, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, version=9.6, distribution-scope=public, release=1755695350)
Jan 21 18:26:28 compute-0 nova_compute[183278]: 2026-01-21 18:26:28.929 183284 DEBUG nova.compute.manager [None req-4a9a9837-3b74-472e-b581-b28ae2487cc9 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Adding trait COMPUTE_STATUS_DISABLED to compute node resource provider 502e4243-611b-433d-a766-9b485d51652d in placement. update_compute_provider_status /usr/lib/python3.9/site-packages/nova/compute/manager.py:610
Jan 21 18:26:29 compute-0 nova_compute[183278]: 2026-01-21 18:26:29.060 183284 DEBUG nova.compute.provider_tree [None req-4a9a9837-3b74-472e-b581-b28ae2487cc9 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Updating resource provider 502e4243-611b-433d-a766-9b485d51652d generation from 18 to 22 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Jan 21 18:26:29 compute-0 podman[192560]: time="2026-01-21T18:26:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 18:26:29 compute-0 podman[192560]: @ - - [21/Jan/2026:18:26:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16587 "" "Go-http-client/1.1"
Jan 21 18:26:29 compute-0 podman[192560]: @ - - [21/Jan/2026:18:26:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2637 "" "Go-http-client/1.1"
Jan 21 18:26:29 compute-0 nova_compute[183278]: 2026-01-21 18:26:29.925 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:26:30 compute-0 nova_compute[183278]: 2026-01-21 18:26:30.332 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:26:31 compute-0 ovn_controller[95419]: 2026-01-21T18:26:31Z|00112|memory_trim|INFO|Detected inactivity (last active 30015 ms ago): trimming memory
Jan 21 18:26:31 compute-0 openstack_network_exporter[195402]: ERROR   18:26:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 21 18:26:31 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:26:31 compute-0 openstack_network_exporter[195402]: ERROR   18:26:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 21 18:26:31 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:26:32 compute-0 nova_compute[183278]: 2026-01-21 18:26:32.604 183284 DEBUG nova.virt.libvirt.driver [None req-4d4f547b-d8c7-4c9e-9839-89b3775dfe1a 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Check if temp file /var/lib/nova/instances/tmp3he_921u exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Jan 21 18:26:32 compute-0 nova_compute[183278]: 2026-01-21 18:26:32.605 183284 DEBUG nova.compute.manager [None req-4d4f547b-d8c7-4c9e-9839-89b3775dfe1a 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp3he_921u',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='ddf15e49-2138-490c-b6b8-e70a211ff35c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Jan 21 18:26:33 compute-0 podman[208281]: 2026-01-21 18:26:33.017007648 +0000 UTC m=+0.065589717 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Jan 21 18:26:33 compute-0 podman[208280]: 2026-01-21 18:26:33.062428468 +0000 UTC m=+0.114858060 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 18:26:33 compute-0 nova_compute[183278]: 2026-01-21 18:26:33.755 183284 DEBUG oslo_concurrency.processutils [None req-4d4f547b-d8c7-4c9e-9839-89b3775dfe1a 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ddf15e49-2138-490c-b6b8-e70a211ff35c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:26:33 compute-0 nova_compute[183278]: 2026-01-21 18:26:33.832 183284 DEBUG oslo_concurrency.processutils [None req-4d4f547b-d8c7-4c9e-9839-89b3775dfe1a 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ddf15e49-2138-490c-b6b8-e70a211ff35c/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:26:33 compute-0 nova_compute[183278]: 2026-01-21 18:26:33.833 183284 DEBUG oslo_concurrency.processutils [None req-4d4f547b-d8c7-4c9e-9839-89b3775dfe1a 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ddf15e49-2138-490c-b6b8-e70a211ff35c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:26:33 compute-0 nova_compute[183278]: 2026-01-21 18:26:33.910 183284 DEBUG oslo_concurrency.processutils [None req-4d4f547b-d8c7-4c9e-9839-89b3775dfe1a 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ddf15e49-2138-490c-b6b8-e70a211ff35c/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:26:34 compute-0 nova_compute[183278]: 2026-01-21 18:26:34.929 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:26:35 compute-0 nova_compute[183278]: 2026-01-21 18:26:35.334 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:26:37 compute-0 sshd-session[208332]: Accepted publickey for nova from 192.168.122.101 port 43196 ssh2: ECDSA SHA256:29a5JNhHHz2bb0ACqZTr6qOKeSRnhiTRA8SK+rzn9gs
Jan 21 18:26:37 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Jan 21 18:26:37 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 21 18:26:37 compute-0 systemd-logind[782]: New session 32 of user nova.
Jan 21 18:26:37 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 21 18:26:37 compute-0 systemd[1]: Starting User Manager for UID 42436...
Jan 21 18:26:37 compute-0 systemd[208346]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 21 18:26:37 compute-0 podman[208334]: 2026-01-21 18:26:37.751734052 +0000 UTC m=+0.053045165 container health_status 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 21 18:26:37 compute-0 systemd[208346]: Queued start job for default target Main User Target.
Jan 21 18:26:37 compute-0 systemd[208346]: Created slice User Application Slice.
Jan 21 18:26:37 compute-0 systemd[208346]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 21 18:26:37 compute-0 systemd[208346]: Started Daily Cleanup of User's Temporary Directories.
Jan 21 18:26:37 compute-0 systemd[208346]: Reached target Paths.
Jan 21 18:26:37 compute-0 systemd[208346]: Reached target Timers.
Jan 21 18:26:37 compute-0 systemd[208346]: Starting D-Bus User Message Bus Socket...
Jan 21 18:26:37 compute-0 systemd[208346]: Starting Create User's Volatile Files and Directories...
Jan 21 18:26:37 compute-0 systemd[208346]: Listening on D-Bus User Message Bus Socket.
Jan 21 18:26:37 compute-0 systemd[208346]: Reached target Sockets.
Jan 21 18:26:37 compute-0 systemd[208346]: Finished Create User's Volatile Files and Directories.
Jan 21 18:26:37 compute-0 systemd[208346]: Reached target Basic System.
Jan 21 18:26:37 compute-0 systemd[208346]: Reached target Main User Target.
Jan 21 18:26:37 compute-0 systemd[208346]: Startup finished in 137ms.
Jan 21 18:26:37 compute-0 systemd[1]: Started User Manager for UID 42436.
Jan 21 18:26:37 compute-0 systemd[1]: Started Session 32 of User nova.
Jan 21 18:26:37 compute-0 sshd-session[208332]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 21 18:26:37 compute-0 sshd-session[208373]: Received disconnect from 192.168.122.101 port 43196:11: disconnected by user
Jan 21 18:26:37 compute-0 sshd-session[208373]: Disconnected from user nova 192.168.122.101 port 43196
Jan 21 18:26:37 compute-0 sshd-session[208332]: pam_unix(sshd:session): session closed for user nova
Jan 21 18:26:37 compute-0 systemd[1]: session-32.scope: Deactivated successfully.
Jan 21 18:26:37 compute-0 systemd-logind[782]: Session 32 logged out. Waiting for processes to exit.
Jan 21 18:26:37 compute-0 systemd-logind[782]: Removed session 32.
Jan 21 18:26:39 compute-0 nova_compute[183278]: 2026-01-21 18:26:39.222 183284 DEBUG nova.compute.manager [req-2bba3a5d-ff5e-4365-b0ed-152adb4ca7df req-75e168e6-d9aa-45d6-b639-2ec7a9b64648 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Received event network-vif-unplugged-5cb37f70-859d-4c72-ad8e-d7c03e18e588 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:26:39 compute-0 nova_compute[183278]: 2026-01-21 18:26:39.223 183284 DEBUG oslo_concurrency.lockutils [req-2bba3a5d-ff5e-4365-b0ed-152adb4ca7df req-75e168e6-d9aa-45d6-b639-2ec7a9b64648 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "ddf15e49-2138-490c-b6b8-e70a211ff35c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:26:39 compute-0 nova_compute[183278]: 2026-01-21 18:26:39.224 183284 DEBUG oslo_concurrency.lockutils [req-2bba3a5d-ff5e-4365-b0ed-152adb4ca7df req-75e168e6-d9aa-45d6-b639-2ec7a9b64648 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "ddf15e49-2138-490c-b6b8-e70a211ff35c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:26:39 compute-0 nova_compute[183278]: 2026-01-21 18:26:39.224 183284 DEBUG oslo_concurrency.lockutils [req-2bba3a5d-ff5e-4365-b0ed-152adb4ca7df req-75e168e6-d9aa-45d6-b639-2ec7a9b64648 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "ddf15e49-2138-490c-b6b8-e70a211ff35c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:26:39 compute-0 nova_compute[183278]: 2026-01-21 18:26:39.224 183284 DEBUG nova.compute.manager [req-2bba3a5d-ff5e-4365-b0ed-152adb4ca7df req-75e168e6-d9aa-45d6-b639-2ec7a9b64648 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] No waiting events found dispatching network-vif-unplugged-5cb37f70-859d-4c72-ad8e-d7c03e18e588 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:26:39 compute-0 nova_compute[183278]: 2026-01-21 18:26:39.225 183284 DEBUG nova.compute.manager [req-2bba3a5d-ff5e-4365-b0ed-152adb4ca7df req-75e168e6-d9aa-45d6-b639-2ec7a9b64648 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Received event network-vif-unplugged-5cb37f70-859d-4c72-ad8e-d7c03e18e588 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 21 18:26:39 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Jan 21 18:26:39 compute-0 nova_compute[183278]: 2026-01-21 18:26:39.931 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:26:40 compute-0 nova_compute[183278]: 2026-01-21 18:26:40.337 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:26:41 compute-0 nova_compute[183278]: 2026-01-21 18:26:41.900 183284 DEBUG nova.compute.manager [req-8029ae6e-ec74-4e9b-b9fe-2883845d9f58 req-bae1ae0d-368e-4265-9cbe-64d60c879657 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Received event network-vif-plugged-5cb37f70-859d-4c72-ad8e-d7c03e18e588 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:26:41 compute-0 nova_compute[183278]: 2026-01-21 18:26:41.901 183284 DEBUG oslo_concurrency.lockutils [req-8029ae6e-ec74-4e9b-b9fe-2883845d9f58 req-bae1ae0d-368e-4265-9cbe-64d60c879657 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "ddf15e49-2138-490c-b6b8-e70a211ff35c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:26:41 compute-0 nova_compute[183278]: 2026-01-21 18:26:41.901 183284 DEBUG oslo_concurrency.lockutils [req-8029ae6e-ec74-4e9b-b9fe-2883845d9f58 req-bae1ae0d-368e-4265-9cbe-64d60c879657 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "ddf15e49-2138-490c-b6b8-e70a211ff35c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:26:41 compute-0 nova_compute[183278]: 2026-01-21 18:26:41.901 183284 DEBUG oslo_concurrency.lockutils [req-8029ae6e-ec74-4e9b-b9fe-2883845d9f58 req-bae1ae0d-368e-4265-9cbe-64d60c879657 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "ddf15e49-2138-490c-b6b8-e70a211ff35c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:26:41 compute-0 nova_compute[183278]: 2026-01-21 18:26:41.902 183284 DEBUG nova.compute.manager [req-8029ae6e-ec74-4e9b-b9fe-2883845d9f58 req-bae1ae0d-368e-4265-9cbe-64d60c879657 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] No waiting events found dispatching network-vif-plugged-5cb37f70-859d-4c72-ad8e-d7c03e18e588 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:26:41 compute-0 nova_compute[183278]: 2026-01-21 18:26:41.902 183284 WARNING nova.compute.manager [req-8029ae6e-ec74-4e9b-b9fe-2883845d9f58 req-bae1ae0d-368e-4265-9cbe-64d60c879657 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Received unexpected event network-vif-plugged-5cb37f70-859d-4c72-ad8e-d7c03e18e588 for instance with vm_state active and task_state migrating.
Jan 21 18:26:42 compute-0 nova_compute[183278]: 2026-01-21 18:26:42.474 183284 INFO nova.compute.manager [None req-4d4f547b-d8c7-4c9e-9839-89b3775dfe1a 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Took 8.56 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Jan 21 18:26:42 compute-0 nova_compute[183278]: 2026-01-21 18:26:42.475 183284 DEBUG nova.compute.manager [None req-4d4f547b-d8c7-4c9e-9839-89b3775dfe1a 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 18:26:42 compute-0 nova_compute[183278]: 2026-01-21 18:26:42.492 183284 DEBUG nova.compute.manager [None req-4d4f547b-d8c7-4c9e-9839-89b3775dfe1a 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp3he_921u',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='ddf15e49-2138-490c-b6b8-e70a211ff35c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(1338e1bd-065a-48bf-b237-f65848f60898),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Jan 21 18:26:42 compute-0 nova_compute[183278]: 2026-01-21 18:26:42.509 183284 DEBUG nova.objects.instance [None req-4d4f547b-d8c7-4c9e-9839-89b3775dfe1a 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lazy-loading 'migration_context' on Instance uuid ddf15e49-2138-490c-b6b8-e70a211ff35c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:26:42 compute-0 nova_compute[183278]: 2026-01-21 18:26:42.511 183284 DEBUG nova.virt.libvirt.driver [None req-4d4f547b-d8c7-4c9e-9839-89b3775dfe1a 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Jan 21 18:26:42 compute-0 nova_compute[183278]: 2026-01-21 18:26:42.512 183284 DEBUG nova.virt.libvirt.driver [None req-4d4f547b-d8c7-4c9e-9839-89b3775dfe1a 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Jan 21 18:26:42 compute-0 nova_compute[183278]: 2026-01-21 18:26:42.513 183284 DEBUG nova.virt.libvirt.driver [None req-4d4f547b-d8c7-4c9e-9839-89b3775dfe1a 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Jan 21 18:26:42 compute-0 nova_compute[183278]: 2026-01-21 18:26:42.527 183284 DEBUG nova.virt.libvirt.vif [None req-4d4f547b-d8c7-4c9e-9839-89b3775dfe1a 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T18:25:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1657890050',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1657890050',id=13,image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T18:25:47Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='fe688847145f4dee992c72dd40bbc1ac',ramdisk_id='',reservation_id='r-tqcqa7vq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',
owner_project_name='tempest-TestExecuteStrategies-1753607426',owner_user_name='tempest-TestExecuteStrategies-1753607426-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T18:25:47Z,user_data=None,user_id='41dc6e790bc54fbfaf5c6007d3fa5f63',uuid=ddf15e49-2138-490c-b6b8-e70a211ff35c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5cb37f70-859d-4c72-ad8e-d7c03e18e588", "address": "fa:16:3e:83:66:98", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap5cb37f70-85", "ovs_interfaceid": "5cb37f70-859d-4c72-ad8e-d7c03e18e588", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 18:26:42 compute-0 nova_compute[183278]: 2026-01-21 18:26:42.528 183284 DEBUG nova.network.os_vif_util [None req-4d4f547b-d8c7-4c9e-9839-89b3775dfe1a 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Converting VIF {"id": "5cb37f70-859d-4c72-ad8e-d7c03e18e588", "address": "fa:16:3e:83:66:98", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap5cb37f70-85", "ovs_interfaceid": "5cb37f70-859d-4c72-ad8e-d7c03e18e588", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 18:26:42 compute-0 nova_compute[183278]: 2026-01-21 18:26:42.528 183284 DEBUG nova.network.os_vif_util [None req-4d4f547b-d8c7-4c9e-9839-89b3775dfe1a 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:83:66:98,bridge_name='br-int',has_traffic_filtering=True,id=5cb37f70-859d-4c72-ad8e-d7c03e18e588,network=Network(405ec01b-76d3-4c3c-a31b-5f16d9641fbf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5cb37f70-85') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 18:26:42 compute-0 nova_compute[183278]: 2026-01-21 18:26:42.529 183284 DEBUG nova.virt.libvirt.migration [None req-4d4f547b-d8c7-4c9e-9839-89b3775dfe1a 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Updating guest XML with vif config: <interface type="ethernet">
Jan 21 18:26:42 compute-0 nova_compute[183278]:   <mac address="fa:16:3e:83:66:98"/>
Jan 21 18:26:42 compute-0 nova_compute[183278]:   <model type="virtio"/>
Jan 21 18:26:42 compute-0 nova_compute[183278]:   <driver name="vhost" rx_queue_size="512"/>
Jan 21 18:26:42 compute-0 nova_compute[183278]:   <mtu size="1442"/>
Jan 21 18:26:42 compute-0 nova_compute[183278]:   <target dev="tap5cb37f70-85"/>
Jan 21 18:26:42 compute-0 nova_compute[183278]: </interface>
Jan 21 18:26:42 compute-0 nova_compute[183278]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Jan 21 18:26:42 compute-0 nova_compute[183278]: 2026-01-21 18:26:42.529 183284 DEBUG nova.virt.libvirt.driver [None req-4d4f547b-d8c7-4c9e-9839-89b3775dfe1a 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Jan 21 18:26:43 compute-0 nova_compute[183278]: 2026-01-21 18:26:43.016 183284 DEBUG nova.virt.libvirt.migration [None req-4d4f547b-d8c7-4c9e-9839-89b3775dfe1a 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 21 18:26:43 compute-0 nova_compute[183278]: 2026-01-21 18:26:43.017 183284 INFO nova.virt.libvirt.migration [None req-4d4f547b-d8c7-4c9e-9839-89b3775dfe1a 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Increasing downtime to 50 ms after 0 sec elapsed time
Jan 21 18:26:43 compute-0 nova_compute[183278]: 2026-01-21 18:26:43.118 183284 INFO nova.virt.libvirt.driver [None req-4d4f547b-d8c7-4c9e-9839-89b3775dfe1a 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Jan 21 18:26:43 compute-0 nova_compute[183278]: 2026-01-21 18:26:43.622 183284 DEBUG nova.virt.libvirt.migration [None req-4d4f547b-d8c7-4c9e-9839-89b3775dfe1a 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 21 18:26:43 compute-0 nova_compute[183278]: 2026-01-21 18:26:43.623 183284 DEBUG nova.virt.libvirt.migration [None req-4d4f547b-d8c7-4c9e-9839-89b3775dfe1a 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Jan 21 18:26:43 compute-0 nova_compute[183278]: 2026-01-21 18:26:43.792 183284 DEBUG nova.virt.driver [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Emitting event <LifecycleEvent: 1769020003.791497, ddf15e49-2138-490c-b6b8-e70a211ff35c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:26:43 compute-0 nova_compute[183278]: 2026-01-21 18:26:43.793 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] VM Paused (Lifecycle Event)
Jan 21 18:26:43 compute-0 nova_compute[183278]: 2026-01-21 18:26:43.816 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:26:43 compute-0 nova_compute[183278]: 2026-01-21 18:26:43.823 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 18:26:43 compute-0 nova_compute[183278]: 2026-01-21 18:26:43.879 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] During sync_power_state the instance has a pending task (migrating). Skip.
Jan 21 18:26:43 compute-0 kernel: tap5cb37f70-85 (unregistering): left promiscuous mode
Jan 21 18:26:43 compute-0 NetworkManager[55506]: <info>  [1769020003.9498] device (tap5cb37f70-85): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 18:26:43 compute-0 ovn_controller[95419]: 2026-01-21T18:26:43Z|00113|binding|INFO|Releasing lport 5cb37f70-859d-4c72-ad8e-d7c03e18e588 from this chassis (sb_readonly=0)
Jan 21 18:26:43 compute-0 ovn_controller[95419]: 2026-01-21T18:26:43Z|00114|binding|INFO|Setting lport 5cb37f70-859d-4c72-ad8e-d7c03e18e588 down in Southbound
Jan 21 18:26:43 compute-0 ovn_controller[95419]: 2026-01-21T18:26:43Z|00115|binding|INFO|Removing iface tap5cb37f70-85 ovn-installed in OVS
Jan 21 18:26:43 compute-0 nova_compute[183278]: 2026-01-21 18:26:43.964 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:26:43 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:26:43.970 104698 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:83:66:98 10.100.0.10'], port_security=['fa:16:3e:83:66:98 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '88a62794-b4a4-47e3-9cce-91e574e684c1'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'ddf15e49-2138-490c-b6b8-e70a211ff35c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-405ec01b-76d3-4c3c-a31b-5f16d9641fbf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe688847145f4dee992c72dd40bbc1ac', 'neutron:revision_number': '8', 'neutron:security_group_ids': '772bc98e-4f63-476c-ac15-1e185ee339f7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=64b67886-c0d9-40d2-a2d0-cf96a9cd3c14, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>], logical_port=5cb37f70-859d-4c72-ad8e-d7c03e18e588) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 18:26:43 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:26:43.971 104698 INFO neutron.agent.ovn.metadata.agent [-] Port 5cb37f70-859d-4c72-ad8e-d7c03e18e588 in datapath 405ec01b-76d3-4c3c-a31b-5f16d9641fbf unbound from our chassis
Jan 21 18:26:43 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:26:43.972 104698 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 405ec01b-76d3-4c3c-a31b-5f16d9641fbf, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 21 18:26:43 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:26:43.975 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[d934e620-b540-4997-8900-94ee3f33fadf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:26:43 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:26:43.976 104698 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf namespace which is not needed anymore
Jan 21 18:26:43 compute-0 nova_compute[183278]: 2026-01-21 18:26:43.977 183284 DEBUG nova.compute.manager [req-a3e64c7f-2857-4254-9005-a2a57f95c109 req-41e89b60-1f7b-4b12-bfcf-3e94f292f531 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Received event network-changed-5cb37f70-859d-4c72-ad8e-d7c03e18e588 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:26:43 compute-0 nova_compute[183278]: 2026-01-21 18:26:43.978 183284 DEBUG nova.compute.manager [req-a3e64c7f-2857-4254-9005-a2a57f95c109 req-41e89b60-1f7b-4b12-bfcf-3e94f292f531 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Refreshing instance network info cache due to event network-changed-5cb37f70-859d-4c72-ad8e-d7c03e18e588. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 18:26:43 compute-0 nova_compute[183278]: 2026-01-21 18:26:43.978 183284 DEBUG oslo_concurrency.lockutils [req-a3e64c7f-2857-4254-9005-a2a57f95c109 req-41e89b60-1f7b-4b12-bfcf-3e94f292f531 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "refresh_cache-ddf15e49-2138-490c-b6b8-e70a211ff35c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:26:43 compute-0 nova_compute[183278]: 2026-01-21 18:26:43.978 183284 DEBUG oslo_concurrency.lockutils [req-a3e64c7f-2857-4254-9005-a2a57f95c109 req-41e89b60-1f7b-4b12-bfcf-3e94f292f531 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquired lock "refresh_cache-ddf15e49-2138-490c-b6b8-e70a211ff35c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 18:26:43 compute-0 nova_compute[183278]: 2026-01-21 18:26:43.979 183284 DEBUG nova.network.neutron [req-a3e64c7f-2857-4254-9005-a2a57f95c109 req-41e89b60-1f7b-4b12-bfcf-3e94f292f531 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Refreshing network info cache for port 5cb37f70-859d-4c72-ad8e-d7c03e18e588 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 18:26:43 compute-0 nova_compute[183278]: 2026-01-21 18:26:43.981 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:26:44 compute-0 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000d.scope: Deactivated successfully.
Jan 21 18:26:44 compute-0 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000d.scope: Consumed 15.071s CPU time.
Jan 21 18:26:44 compute-0 systemd-machined[154592]: Machine qemu-10-instance-0000000d terminated.
Jan 21 18:26:44 compute-0 neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf[208132]: [NOTICE]   (208136) : haproxy version is 2.8.14-c23fe91
Jan 21 18:26:44 compute-0 neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf[208132]: [NOTICE]   (208136) : path to executable is /usr/sbin/haproxy
Jan 21 18:26:44 compute-0 neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf[208132]: [WARNING]  (208136) : Exiting Master process...
Jan 21 18:26:44 compute-0 neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf[208132]: [WARNING]  (208136) : Exiting Master process...
Jan 21 18:26:44 compute-0 neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf[208132]: [ALERT]    (208136) : Current worker (208138) exited with code 143 (Terminated)
Jan 21 18:26:44 compute-0 neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf[208132]: [WARNING]  (208136) : All workers exited. Exiting... (0)
Jan 21 18:26:44 compute-0 systemd[1]: libpod-8088761110efd0583ede328b84701b1c2d9a9122282d6771bfb70aade393bf99.scope: Deactivated successfully.
Jan 21 18:26:44 compute-0 podman[208418]: 2026-01-21 18:26:44.112237279 +0000 UTC m=+0.044668282 container died 8088761110efd0583ede328b84701b1c2d9a9122282d6771bfb70aade393bf99 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 21 18:26:44 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8088761110efd0583ede328b84701b1c2d9a9122282d6771bfb70aade393bf99-userdata-shm.mount: Deactivated successfully.
Jan 21 18:26:44 compute-0 systemd[1]: var-lib-containers-storage-overlay-c8433fb588c03a69f19e5575dab6caf40ec2452c1ad4068f154dcd575c80c031-merged.mount: Deactivated successfully.
Jan 21 18:26:44 compute-0 podman[208418]: 2026-01-21 18:26:44.148024404 +0000 UTC m=+0.080455407 container cleanup 8088761110efd0583ede328b84701b1c2d9a9122282d6771bfb70aade393bf99 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 21 18:26:44 compute-0 systemd[1]: libpod-conmon-8088761110efd0583ede328b84701b1c2d9a9122282d6771bfb70aade393bf99.scope: Deactivated successfully.
Jan 21 18:26:44 compute-0 nova_compute[183278]: 2026-01-21 18:26:44.180 183284 DEBUG nova.virt.libvirt.guest [None req-4d4f547b-d8c7-4c9e-9839-89b3775dfe1a 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Jan 21 18:26:44 compute-0 nova_compute[183278]: 2026-01-21 18:26:44.181 183284 INFO nova.virt.libvirt.driver [None req-4d4f547b-d8c7-4c9e-9839-89b3775dfe1a 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Migration operation has completed
Jan 21 18:26:44 compute-0 nova_compute[183278]: 2026-01-21 18:26:44.181 183284 INFO nova.compute.manager [None req-4d4f547b-d8c7-4c9e-9839-89b3775dfe1a 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] _post_live_migration() is started..
Jan 21 18:26:44 compute-0 nova_compute[183278]: 2026-01-21 18:26:44.186 183284 DEBUG nova.virt.libvirt.driver [None req-4d4f547b-d8c7-4c9e-9839-89b3775dfe1a 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Jan 21 18:26:44 compute-0 nova_compute[183278]: 2026-01-21 18:26:44.186 183284 DEBUG nova.virt.libvirt.driver [None req-4d4f547b-d8c7-4c9e-9839-89b3775dfe1a 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Jan 21 18:26:44 compute-0 nova_compute[183278]: 2026-01-21 18:26:44.186 183284 DEBUG nova.virt.libvirt.driver [None req-4d4f547b-d8c7-4c9e-9839-89b3775dfe1a 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Jan 21 18:26:44 compute-0 podman[208456]: 2026-01-21 18:26:44.207219817 +0000 UTC m=+0.040296776 container remove 8088761110efd0583ede328b84701b1c2d9a9122282d6771bfb70aade393bf99 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 21 18:26:44 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:26:44.211 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[4c2e544b-abac-46af-9696-4cfb8a6f378a]: (4, ('Wed Jan 21 06:26:44 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf (8088761110efd0583ede328b84701b1c2d9a9122282d6771bfb70aade393bf99)\n8088761110efd0583ede328b84701b1c2d9a9122282d6771bfb70aade393bf99\nWed Jan 21 06:26:44 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf (8088761110efd0583ede328b84701b1c2d9a9122282d6771bfb70aade393bf99)\n8088761110efd0583ede328b84701b1c2d9a9122282d6771bfb70aade393bf99\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:26:44 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:26:44.213 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[d5e1afa0-554a-438c-b493-df7518e1a09b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:26:44 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:26:44.213 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap405ec01b-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:26:44 compute-0 nova_compute[183278]: 2026-01-21 18:26:44.215 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:26:44 compute-0 nova_compute[183278]: 2026-01-21 18:26:44.226 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:26:44 compute-0 kernel: tap405ec01b-70: left promiscuous mode
Jan 21 18:26:44 compute-0 nova_compute[183278]: 2026-01-21 18:26:44.231 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:26:44 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:26:44.233 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[e9cba422-7f11-4b92-a303-0c36209b7587]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:26:44 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:26:44.242 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[990ba5d6-2a4f-4d14-8b4d-bee8bffa2788]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:26:44 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:26:44.244 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[c64d7713-b80b-4503-a688-2d009383d131]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:26:44 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:26:44.256 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[27f8ec27-e463-4bb1-b7b1-1788525e498a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 451090, 'reachable_time': 43047, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 208483, 'error': None, 'target': 'ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:26:44 compute-0 systemd[1]: run-netns-ovnmeta\x2d405ec01b\x2d76d3\x2d4c3c\x2da31b\x2d5f16d9641fbf.mount: Deactivated successfully.
Jan 21 18:26:44 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:26:44.261 105036 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 21 18:26:44 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:26:44.261 105036 DEBUG oslo.privsep.daemon [-] privsep: reply[b378be85-2c80-425a-9af2-dbbbe71a309c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:26:44 compute-0 nova_compute[183278]: 2026-01-21 18:26:44.934 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:26:45 compute-0 nova_compute[183278]: 2026-01-21 18:26:45.101 183284 DEBUG nova.compute.manager [req-61823771-637a-42c8-ace9-959f96211fe9 req-9f82fcb3-95c3-4596-9a6d-672edef9e17e 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Received event network-vif-unplugged-5cb37f70-859d-4c72-ad8e-d7c03e18e588 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:26:45 compute-0 nova_compute[183278]: 2026-01-21 18:26:45.102 183284 DEBUG oslo_concurrency.lockutils [req-61823771-637a-42c8-ace9-959f96211fe9 req-9f82fcb3-95c3-4596-9a6d-672edef9e17e 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "ddf15e49-2138-490c-b6b8-e70a211ff35c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:26:45 compute-0 nova_compute[183278]: 2026-01-21 18:26:45.102 183284 DEBUG oslo_concurrency.lockutils [req-61823771-637a-42c8-ace9-959f96211fe9 req-9f82fcb3-95c3-4596-9a6d-672edef9e17e 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "ddf15e49-2138-490c-b6b8-e70a211ff35c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:26:45 compute-0 nova_compute[183278]: 2026-01-21 18:26:45.102 183284 DEBUG oslo_concurrency.lockutils [req-61823771-637a-42c8-ace9-959f96211fe9 req-9f82fcb3-95c3-4596-9a6d-672edef9e17e 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "ddf15e49-2138-490c-b6b8-e70a211ff35c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:26:45 compute-0 nova_compute[183278]: 2026-01-21 18:26:45.102 183284 DEBUG nova.compute.manager [req-61823771-637a-42c8-ace9-959f96211fe9 req-9f82fcb3-95c3-4596-9a6d-672edef9e17e 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] No waiting events found dispatching network-vif-unplugged-5cb37f70-859d-4c72-ad8e-d7c03e18e588 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:26:45 compute-0 nova_compute[183278]: 2026-01-21 18:26:45.103 183284 DEBUG nova.compute.manager [req-61823771-637a-42c8-ace9-959f96211fe9 req-9f82fcb3-95c3-4596-9a6d-672edef9e17e 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Received event network-vif-unplugged-5cb37f70-859d-4c72-ad8e-d7c03e18e588 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 21 18:26:45 compute-0 nova_compute[183278]: 2026-01-21 18:26:45.338 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:26:45 compute-0 nova_compute[183278]: 2026-01-21 18:26:45.824 183284 DEBUG nova.network.neutron [None req-4d4f547b-d8c7-4c9e-9839-89b3775dfe1a 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Activated binding for port 5cb37f70-859d-4c72-ad8e-d7c03e18e588 and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Jan 21 18:26:45 compute-0 nova_compute[183278]: 2026-01-21 18:26:45.825 183284 DEBUG nova.compute.manager [None req-4d4f547b-d8c7-4c9e-9839-89b3775dfe1a 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "5cb37f70-859d-4c72-ad8e-d7c03e18e588", "address": "fa:16:3e:83:66:98", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5cb37f70-85", "ovs_interfaceid": "5cb37f70-859d-4c72-ad8e-d7c03e18e588", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Jan 21 18:26:45 compute-0 nova_compute[183278]: 2026-01-21 18:26:45.826 183284 DEBUG nova.virt.libvirt.vif [None req-4d4f547b-d8c7-4c9e-9839-89b3775dfe1a 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T18:25:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1657890050',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1657890050',id=13,image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T18:25:47Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='fe688847145f4dee992c72dd40bbc1ac',ramdisk_id='',reservation_id='r-tqcqa7vq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',
owner_project_name='tempest-TestExecuteStrategies-1753607426',owner_user_name='tempest-TestExecuteStrategies-1753607426-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T18:26:30Z,user_data=None,user_id='41dc6e790bc54fbfaf5c6007d3fa5f63',uuid=ddf15e49-2138-490c-b6b8-e70a211ff35c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5cb37f70-859d-4c72-ad8e-d7c03e18e588", "address": "fa:16:3e:83:66:98", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5cb37f70-85", "ovs_interfaceid": "5cb37f70-859d-4c72-ad8e-d7c03e18e588", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 18:26:45 compute-0 nova_compute[183278]: 2026-01-21 18:26:45.826 183284 DEBUG nova.network.os_vif_util [None req-4d4f547b-d8c7-4c9e-9839-89b3775dfe1a 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Converting VIF {"id": "5cb37f70-859d-4c72-ad8e-d7c03e18e588", "address": "fa:16:3e:83:66:98", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5cb37f70-85", "ovs_interfaceid": "5cb37f70-859d-4c72-ad8e-d7c03e18e588", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 18:26:45 compute-0 nova_compute[183278]: 2026-01-21 18:26:45.826 183284 DEBUG nova.network.os_vif_util [None req-4d4f547b-d8c7-4c9e-9839-89b3775dfe1a 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:83:66:98,bridge_name='br-int',has_traffic_filtering=True,id=5cb37f70-859d-4c72-ad8e-d7c03e18e588,network=Network(405ec01b-76d3-4c3c-a31b-5f16d9641fbf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5cb37f70-85') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 18:26:45 compute-0 nova_compute[183278]: 2026-01-21 18:26:45.827 183284 DEBUG os_vif [None req-4d4f547b-d8c7-4c9e-9839-89b3775dfe1a 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:83:66:98,bridge_name='br-int',has_traffic_filtering=True,id=5cb37f70-859d-4c72-ad8e-d7c03e18e588,network=Network(405ec01b-76d3-4c3c-a31b-5f16d9641fbf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5cb37f70-85') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 21 18:26:45 compute-0 nova_compute[183278]: 2026-01-21 18:26:45.828 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:26:45 compute-0 nova_compute[183278]: 2026-01-21 18:26:45.828 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5cb37f70-85, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:26:45 compute-0 nova_compute[183278]: 2026-01-21 18:26:45.829 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:26:45 compute-0 nova_compute[183278]: 2026-01-21 18:26:45.831 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:26:45 compute-0 nova_compute[183278]: 2026-01-21 18:26:45.833 183284 INFO os_vif [None req-4d4f547b-d8c7-4c9e-9839-89b3775dfe1a 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:83:66:98,bridge_name='br-int',has_traffic_filtering=True,id=5cb37f70-859d-4c72-ad8e-d7c03e18e588,network=Network(405ec01b-76d3-4c3c-a31b-5f16d9641fbf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5cb37f70-85')
Jan 21 18:26:45 compute-0 nova_compute[183278]: 2026-01-21 18:26:45.833 183284 DEBUG oslo_concurrency.lockutils [None req-4d4f547b-d8c7-4c9e-9839-89b3775dfe1a 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:26:45 compute-0 nova_compute[183278]: 2026-01-21 18:26:45.833 183284 DEBUG oslo_concurrency.lockutils [None req-4d4f547b-d8c7-4c9e-9839-89b3775dfe1a 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:26:45 compute-0 nova_compute[183278]: 2026-01-21 18:26:45.834 183284 DEBUG oslo_concurrency.lockutils [None req-4d4f547b-d8c7-4c9e-9839-89b3775dfe1a 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:26:45 compute-0 nova_compute[183278]: 2026-01-21 18:26:45.834 183284 DEBUG nova.compute.manager [None req-4d4f547b-d8c7-4c9e-9839-89b3775dfe1a 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Jan 21 18:26:45 compute-0 nova_compute[183278]: 2026-01-21 18:26:45.834 183284 INFO nova.virt.libvirt.driver [None req-4d4f547b-d8c7-4c9e-9839-89b3775dfe1a 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Deleting instance files /var/lib/nova/instances/ddf15e49-2138-490c-b6b8-e70a211ff35c_del
Jan 21 18:26:45 compute-0 nova_compute[183278]: 2026-01-21 18:26:45.835 183284 INFO nova.virt.libvirt.driver [None req-4d4f547b-d8c7-4c9e-9839-89b3775dfe1a 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Deletion of /var/lib/nova/instances/ddf15e49-2138-490c-b6b8-e70a211ff35c_del complete
Jan 21 18:26:46 compute-0 nova_compute[183278]: 2026-01-21 18:26:46.886 183284 DEBUG nova.network.neutron [req-a3e64c7f-2857-4254-9005-a2a57f95c109 req-41e89b60-1f7b-4b12-bfcf-3e94f292f531 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Updated VIF entry in instance network info cache for port 5cb37f70-859d-4c72-ad8e-d7c03e18e588. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 18:26:46 compute-0 nova_compute[183278]: 2026-01-21 18:26:46.886 183284 DEBUG nova.network.neutron [req-a3e64c7f-2857-4254-9005-a2a57f95c109 req-41e89b60-1f7b-4b12-bfcf-3e94f292f531 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Updating instance_info_cache with network_info: [{"id": "5cb37f70-859d-4c72-ad8e-d7c03e18e588", "address": "fa:16:3e:83:66:98", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5cb37f70-85", "ovs_interfaceid": "5cb37f70-859d-4c72-ad8e-d7c03e18e588", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:26:46 compute-0 nova_compute[183278]: 2026-01-21 18:26:46.910 183284 DEBUG oslo_concurrency.lockutils [req-a3e64c7f-2857-4254-9005-a2a57f95c109 req-41e89b60-1f7b-4b12-bfcf-3e94f292f531 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Releasing lock "refresh_cache-ddf15e49-2138-490c-b6b8-e70a211ff35c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 18:26:47 compute-0 nova_compute[183278]: 2026-01-21 18:26:47.217 183284 DEBUG nova.compute.manager [req-6ea0cf35-c65b-4814-b4a3-6a39074fd8d4 req-f52c0f68-0aa6-4b3a-a009-d6902818bd7e 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Received event network-vif-plugged-5cb37f70-859d-4c72-ad8e-d7c03e18e588 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:26:47 compute-0 nova_compute[183278]: 2026-01-21 18:26:47.217 183284 DEBUG oslo_concurrency.lockutils [req-6ea0cf35-c65b-4814-b4a3-6a39074fd8d4 req-f52c0f68-0aa6-4b3a-a009-d6902818bd7e 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "ddf15e49-2138-490c-b6b8-e70a211ff35c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:26:47 compute-0 nova_compute[183278]: 2026-01-21 18:26:47.217 183284 DEBUG oslo_concurrency.lockutils [req-6ea0cf35-c65b-4814-b4a3-6a39074fd8d4 req-f52c0f68-0aa6-4b3a-a009-d6902818bd7e 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "ddf15e49-2138-490c-b6b8-e70a211ff35c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:26:47 compute-0 nova_compute[183278]: 2026-01-21 18:26:47.218 183284 DEBUG oslo_concurrency.lockutils [req-6ea0cf35-c65b-4814-b4a3-6a39074fd8d4 req-f52c0f68-0aa6-4b3a-a009-d6902818bd7e 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "ddf15e49-2138-490c-b6b8-e70a211ff35c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:26:47 compute-0 nova_compute[183278]: 2026-01-21 18:26:47.218 183284 DEBUG nova.compute.manager [req-6ea0cf35-c65b-4814-b4a3-6a39074fd8d4 req-f52c0f68-0aa6-4b3a-a009-d6902818bd7e 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] No waiting events found dispatching network-vif-plugged-5cb37f70-859d-4c72-ad8e-d7c03e18e588 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:26:47 compute-0 nova_compute[183278]: 2026-01-21 18:26:47.218 183284 WARNING nova.compute.manager [req-6ea0cf35-c65b-4814-b4a3-6a39074fd8d4 req-f52c0f68-0aa6-4b3a-a009-d6902818bd7e 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Received unexpected event network-vif-plugged-5cb37f70-859d-4c72-ad8e-d7c03e18e588 for instance with vm_state active and task_state migrating.
Jan 21 18:26:47 compute-0 nova_compute[183278]: 2026-01-21 18:26:47.218 183284 DEBUG nova.compute.manager [req-6ea0cf35-c65b-4814-b4a3-6a39074fd8d4 req-f52c0f68-0aa6-4b3a-a009-d6902818bd7e 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Received event network-vif-plugged-5cb37f70-859d-4c72-ad8e-d7c03e18e588 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:26:47 compute-0 nova_compute[183278]: 2026-01-21 18:26:47.219 183284 DEBUG oslo_concurrency.lockutils [req-6ea0cf35-c65b-4814-b4a3-6a39074fd8d4 req-f52c0f68-0aa6-4b3a-a009-d6902818bd7e 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "ddf15e49-2138-490c-b6b8-e70a211ff35c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:26:47 compute-0 nova_compute[183278]: 2026-01-21 18:26:47.219 183284 DEBUG oslo_concurrency.lockutils [req-6ea0cf35-c65b-4814-b4a3-6a39074fd8d4 req-f52c0f68-0aa6-4b3a-a009-d6902818bd7e 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "ddf15e49-2138-490c-b6b8-e70a211ff35c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:26:47 compute-0 nova_compute[183278]: 2026-01-21 18:26:47.219 183284 DEBUG oslo_concurrency.lockutils [req-6ea0cf35-c65b-4814-b4a3-6a39074fd8d4 req-f52c0f68-0aa6-4b3a-a009-d6902818bd7e 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "ddf15e49-2138-490c-b6b8-e70a211ff35c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:26:47 compute-0 nova_compute[183278]: 2026-01-21 18:26:47.219 183284 DEBUG nova.compute.manager [req-6ea0cf35-c65b-4814-b4a3-6a39074fd8d4 req-f52c0f68-0aa6-4b3a-a009-d6902818bd7e 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] No waiting events found dispatching network-vif-plugged-5cb37f70-859d-4c72-ad8e-d7c03e18e588 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:26:47 compute-0 nova_compute[183278]: 2026-01-21 18:26:47.220 183284 WARNING nova.compute.manager [req-6ea0cf35-c65b-4814-b4a3-6a39074fd8d4 req-f52c0f68-0aa6-4b3a-a009-d6902818bd7e 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Received unexpected event network-vif-plugged-5cb37f70-859d-4c72-ad8e-d7c03e18e588 for instance with vm_state active and task_state migrating.
Jan 21 18:26:47 compute-0 nova_compute[183278]: 2026-01-21 18:26:47.220 183284 DEBUG nova.compute.manager [req-6ea0cf35-c65b-4814-b4a3-6a39074fd8d4 req-f52c0f68-0aa6-4b3a-a009-d6902818bd7e 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Received event network-vif-plugged-5cb37f70-859d-4c72-ad8e-d7c03e18e588 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:26:47 compute-0 nova_compute[183278]: 2026-01-21 18:26:47.220 183284 DEBUG oslo_concurrency.lockutils [req-6ea0cf35-c65b-4814-b4a3-6a39074fd8d4 req-f52c0f68-0aa6-4b3a-a009-d6902818bd7e 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "ddf15e49-2138-490c-b6b8-e70a211ff35c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:26:47 compute-0 nova_compute[183278]: 2026-01-21 18:26:47.220 183284 DEBUG oslo_concurrency.lockutils [req-6ea0cf35-c65b-4814-b4a3-6a39074fd8d4 req-f52c0f68-0aa6-4b3a-a009-d6902818bd7e 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "ddf15e49-2138-490c-b6b8-e70a211ff35c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:26:47 compute-0 nova_compute[183278]: 2026-01-21 18:26:47.220 183284 DEBUG oslo_concurrency.lockutils [req-6ea0cf35-c65b-4814-b4a3-6a39074fd8d4 req-f52c0f68-0aa6-4b3a-a009-d6902818bd7e 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "ddf15e49-2138-490c-b6b8-e70a211ff35c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:26:47 compute-0 nova_compute[183278]: 2026-01-21 18:26:47.221 183284 DEBUG nova.compute.manager [req-6ea0cf35-c65b-4814-b4a3-6a39074fd8d4 req-f52c0f68-0aa6-4b3a-a009-d6902818bd7e 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] No waiting events found dispatching network-vif-plugged-5cb37f70-859d-4c72-ad8e-d7c03e18e588 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:26:47 compute-0 nova_compute[183278]: 2026-01-21 18:26:47.221 183284 WARNING nova.compute.manager [req-6ea0cf35-c65b-4814-b4a3-6a39074fd8d4 req-f52c0f68-0aa6-4b3a-a009-d6902818bd7e 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Received unexpected event network-vif-plugged-5cb37f70-859d-4c72-ad8e-d7c03e18e588 for instance with vm_state active and task_state migrating.
Jan 21 18:26:48 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Jan 21 18:26:48 compute-0 systemd[208346]: Activating special unit Exit the Session...
Jan 21 18:26:48 compute-0 systemd[208346]: Stopped target Main User Target.
Jan 21 18:26:48 compute-0 systemd[208346]: Stopped target Basic System.
Jan 21 18:26:48 compute-0 systemd[208346]: Stopped target Paths.
Jan 21 18:26:48 compute-0 systemd[208346]: Stopped target Sockets.
Jan 21 18:26:48 compute-0 systemd[208346]: Stopped target Timers.
Jan 21 18:26:48 compute-0 systemd[208346]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 21 18:26:48 compute-0 systemd[208346]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 21 18:26:48 compute-0 systemd[208346]: Closed D-Bus User Message Bus Socket.
Jan 21 18:26:48 compute-0 systemd[208346]: Stopped Create User's Volatile Files and Directories.
Jan 21 18:26:48 compute-0 systemd[208346]: Removed slice User Application Slice.
Jan 21 18:26:48 compute-0 systemd[208346]: Reached target Shutdown.
Jan 21 18:26:48 compute-0 systemd[208346]: Finished Exit the Session.
Jan 21 18:26:48 compute-0 systemd[208346]: Reached target Exit the Session.
Jan 21 18:26:48 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Jan 21 18:26:48 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Jan 21 18:26:48 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 21 18:26:48 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 21 18:26:48 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 21 18:26:48 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 21 18:26:48 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Jan 21 18:26:48 compute-0 sshd-session[208485]: Invalid user ansible_user from 64.227.98.100 port 39712
Jan 21 18:26:48 compute-0 sshd-session[208485]: Connection closed by invalid user ansible_user 64.227.98.100 port 39712 [preauth]
Jan 21 18:26:50 compute-0 nova_compute[183278]: 2026-01-21 18:26:50.340 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:26:50 compute-0 nova_compute[183278]: 2026-01-21 18:26:50.830 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:26:52 compute-0 nova_compute[183278]: 2026-01-21 18:26:52.054 183284 DEBUG oslo_concurrency.lockutils [None req-4d4f547b-d8c7-4c9e-9839-89b3775dfe1a 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "ddf15e49-2138-490c-b6b8-e70a211ff35c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:26:52 compute-0 nova_compute[183278]: 2026-01-21 18:26:52.054 183284 DEBUG oslo_concurrency.lockutils [None req-4d4f547b-d8c7-4c9e-9839-89b3775dfe1a 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lock "ddf15e49-2138-490c-b6b8-e70a211ff35c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:26:52 compute-0 nova_compute[183278]: 2026-01-21 18:26:52.054 183284 DEBUG oslo_concurrency.lockutils [None req-4d4f547b-d8c7-4c9e-9839-89b3775dfe1a 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lock "ddf15e49-2138-490c-b6b8-e70a211ff35c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:26:52 compute-0 nova_compute[183278]: 2026-01-21 18:26:52.074 183284 DEBUG oslo_concurrency.lockutils [None req-4d4f547b-d8c7-4c9e-9839-89b3775dfe1a 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:26:52 compute-0 nova_compute[183278]: 2026-01-21 18:26:52.074 183284 DEBUG oslo_concurrency.lockutils [None req-4d4f547b-d8c7-4c9e-9839-89b3775dfe1a 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:26:52 compute-0 nova_compute[183278]: 2026-01-21 18:26:52.075 183284 DEBUG oslo_concurrency.lockutils [None req-4d4f547b-d8c7-4c9e-9839-89b3775dfe1a 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:26:52 compute-0 nova_compute[183278]: 2026-01-21 18:26:52.075 183284 DEBUG nova.compute.resource_tracker [None req-4d4f547b-d8c7-4c9e-9839-89b3775dfe1a 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 18:26:52 compute-0 nova_compute[183278]: 2026-01-21 18:26:52.247 183284 WARNING nova.virt.libvirt.driver [None req-4d4f547b-d8c7-4c9e-9839-89b3775dfe1a 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 18:26:52 compute-0 nova_compute[183278]: 2026-01-21 18:26:52.249 183284 DEBUG nova.compute.resource_tracker [None req-4d4f547b-d8c7-4c9e-9839-89b3775dfe1a 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5858MB free_disk=73.38150024414062GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": 
"0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 18:26:52 compute-0 nova_compute[183278]: 2026-01-21 18:26:52.249 183284 DEBUG oslo_concurrency.lockutils [None req-4d4f547b-d8c7-4c9e-9839-89b3775dfe1a 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:26:52 compute-0 nova_compute[183278]: 2026-01-21 18:26:52.249 183284 DEBUG oslo_concurrency.lockutils [None req-4d4f547b-d8c7-4c9e-9839-89b3775dfe1a 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:26:52 compute-0 nova_compute[183278]: 2026-01-21 18:26:52.449 183284 DEBUG nova.compute.resource_tracker [None req-4d4f547b-d8c7-4c9e-9839-89b3775dfe1a 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Migration for instance ddf15e49-2138-490c-b6b8-e70a211ff35c refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Jan 21 18:26:52 compute-0 nova_compute[183278]: 2026-01-21 18:26:52.487 183284 DEBUG nova.compute.resource_tracker [None req-4d4f547b-d8c7-4c9e-9839-89b3775dfe1a 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Jan 21 18:26:52 compute-0 nova_compute[183278]: 2026-01-21 18:26:52.520 183284 DEBUG nova.compute.resource_tracker [None req-4d4f547b-d8c7-4c9e-9839-89b3775dfe1a 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Migration 1338e1bd-065a-48bf-b237-f65848f60898 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Jan 21 18:26:52 compute-0 nova_compute[183278]: 2026-01-21 18:26:52.521 183284 DEBUG nova.compute.resource_tracker [None req-4d4f547b-d8c7-4c9e-9839-89b3775dfe1a 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 18:26:52 compute-0 nova_compute[183278]: 2026-01-21 18:26:52.521 183284 DEBUG nova.compute.resource_tracker [None req-4d4f547b-d8c7-4c9e-9839-89b3775dfe1a 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 18:26:52 compute-0 nova_compute[183278]: 2026-01-21 18:26:52.579 183284 DEBUG nova.compute.provider_tree [None req-4d4f547b-d8c7-4c9e-9839-89b3775dfe1a 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Inventory has not changed in ProviderTree for provider: 502e4243-611b-433d-a766-9b485d51652d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 18:26:52 compute-0 nova_compute[183278]: 2026-01-21 18:26:52.595 183284 DEBUG nova.scheduler.client.report [None req-4d4f547b-d8c7-4c9e-9839-89b3775dfe1a 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Inventory has not changed for provider 502e4243-611b-433d-a766-9b485d51652d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 18:26:52 compute-0 nova_compute[183278]: 2026-01-21 18:26:52.613 183284 DEBUG nova.compute.resource_tracker [None req-4d4f547b-d8c7-4c9e-9839-89b3775dfe1a 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 18:26:52 compute-0 nova_compute[183278]: 2026-01-21 18:26:52.613 183284 DEBUG oslo_concurrency.lockutils [None req-4d4f547b-d8c7-4c9e-9839-89b3775dfe1a 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.364s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:26:52 compute-0 nova_compute[183278]: 2026-01-21 18:26:52.619 183284 INFO nova.compute.manager [None req-4d4f547b-d8c7-4c9e-9839-89b3775dfe1a 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Jan 21 18:26:52 compute-0 nova_compute[183278]: 2026-01-21 18:26:52.707 183284 INFO nova.scheduler.client.report [None req-4d4f547b-d8c7-4c9e-9839-89b3775dfe1a 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Deleted allocation for migration 1338e1bd-065a-48bf-b237-f65848f60898
Jan 21 18:26:52 compute-0 nova_compute[183278]: 2026-01-21 18:26:52.708 183284 DEBUG nova.virt.libvirt.driver [None req-4d4f547b-d8c7-4c9e-9839-89b3775dfe1a 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Jan 21 18:26:55 compute-0 nova_compute[183278]: 2026-01-21 18:26:55.343 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:26:55 compute-0 nova_compute[183278]: 2026-01-21 18:26:55.832 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:26:57 compute-0 podman[208488]: 2026-01-21 18:26:57.037435363 +0000 UTC m=+0.092567351 container health_status 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, architecture=x86_64, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.openshift.expose-services=, container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9)
Jan 21 18:26:57 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:26:57.282 104698 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e2:aa:66', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f6:39:87:73:c8'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 18:26:57 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:26:57.283 104698 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 21 18:26:57 compute-0 nova_compute[183278]: 2026-01-21 18:26:57.283 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:26:59 compute-0 nova_compute[183278]: 2026-01-21 18:26:59.180 183284 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769020004.1799443, ddf15e49-2138-490c-b6b8-e70a211ff35c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:26:59 compute-0 nova_compute[183278]: 2026-01-21 18:26:59.181 183284 INFO nova.compute.manager [-] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] VM Stopped (Lifecycle Event)
Jan 21 18:26:59 compute-0 nova_compute[183278]: 2026-01-21 18:26:59.198 183284 DEBUG nova.compute.manager [None req-01919902-23ce-4946-8859-75c558ecd8b6 - - - - - -] [instance: ddf15e49-2138-490c-b6b8-e70a211ff35c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:26:59 compute-0 podman[192560]: time="2026-01-21T18:26:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 18:26:59 compute-0 podman[192560]: @ - - [21/Jan/2026:18:26:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15354 "" "Go-http-client/1.1"
Jan 21 18:26:59 compute-0 podman[192560]: @ - - [21/Jan/2026:18:26:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2174 "" "Go-http-client/1.1"
Jan 21 18:27:00 compute-0 nova_compute[183278]: 2026-01-21 18:27:00.345 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:27:00 compute-0 nova_compute[183278]: 2026-01-21 18:27:00.835 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:27:01 compute-0 openstack_network_exporter[195402]: ERROR   18:27:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 21 18:27:01 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:27:01 compute-0 openstack_network_exporter[195402]: ERROR   18:27:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 21 18:27:01 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:27:03 compute-0 podman[208509]: 2026-01-21 18:27:03.993259613 +0000 UTC m=+0.049537849 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 21 18:27:04 compute-0 podman[208508]: 2026-01-21 18:27:04.012367526 +0000 UTC m=+0.071563673 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Jan 21 18:27:05 compute-0 nova_compute[183278]: 2026-01-21 18:27:05.346 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:27:05 compute-0 nova_compute[183278]: 2026-01-21 18:27:05.836 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:27:06 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:27:06.285 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a4db021d-a451-4e5f-8011-49af760bda68, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:27:07 compute-0 podman[208552]: 2026-01-21 18:27:07.990305939 +0000 UTC m=+0.050187676 container health_status 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 21 18:27:10 compute-0 nova_compute[183278]: 2026-01-21 18:27:10.347 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:27:10 compute-0 nova_compute[183278]: 2026-01-21 18:27:10.838 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:27:15 compute-0 nova_compute[183278]: 2026-01-21 18:27:15.349 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:27:15 compute-0 nova_compute[183278]: 2026-01-21 18:27:15.841 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:27:16 compute-0 nova_compute[183278]: 2026-01-21 18:27:16.653 183284 DEBUG nova.compute.manager [None req-724ba89e-4de9-41cd-bc7f-a752dcb83fd2 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] Removing trait COMPUTE_STATUS_DISABLED from compute node resource provider 502e4243-611b-433d-a766-9b485d51652d in placement. update_compute_provider_status /usr/lib/python3.9/site-packages/nova/compute/manager.py:606
Jan 21 18:27:16 compute-0 nova_compute[183278]: 2026-01-21 18:27:16.693 183284 DEBUG nova.compute.provider_tree [None req-724ba89e-4de9-41cd-bc7f-a752dcb83fd2 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] Updating resource provider 502e4243-611b-433d-a766-9b485d51652d generation from 22 to 25 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Jan 21 18:27:16 compute-0 nova_compute[183278]: 2026-01-21 18:27:16.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:27:16 compute-0 nova_compute[183278]: 2026-01-21 18:27:16.817 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 18:27:16 compute-0 nova_compute[183278]: 2026-01-21 18:27:16.817 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 18:27:16 compute-0 nova_compute[183278]: 2026-01-21 18:27:16.841 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 21 18:27:19 compute-0 nova_compute[183278]: 2026-01-21 18:27:19.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:27:19 compute-0 nova_compute[183278]: 2026-01-21 18:27:19.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:27:19 compute-0 nova_compute[183278]: 2026-01-21 18:27:19.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:27:19 compute-0 nova_compute[183278]: 2026-01-21 18:27:19.853 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:27:19 compute-0 nova_compute[183278]: 2026-01-21 18:27:19.854 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:27:19 compute-0 nova_compute[183278]: 2026-01-21 18:27:19.854 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:27:19 compute-0 nova_compute[183278]: 2026-01-21 18:27:19.854 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 18:27:19 compute-0 nova_compute[183278]: 2026-01-21 18:27:19.983 183284 WARNING nova.virt.libvirt.driver [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 18:27:19 compute-0 nova_compute[183278]: 2026-01-21 18:27:19.984 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5869MB free_disk=73.38154983520508GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 18:27:19 compute-0 nova_compute[183278]: 2026-01-21 18:27:19.984 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:27:19 compute-0 nova_compute[183278]: 2026-01-21 18:27:19.984 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:27:20 compute-0 nova_compute[183278]: 2026-01-21 18:27:20.038 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 18:27:20 compute-0 nova_compute[183278]: 2026-01-21 18:27:20.039 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 18:27:20 compute-0 nova_compute[183278]: 2026-01-21 18:27:20.058 183284 DEBUG nova.compute.provider_tree [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Inventory has not changed in ProviderTree for provider: 502e4243-611b-433d-a766-9b485d51652d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 18:27:20 compute-0 nova_compute[183278]: 2026-01-21 18:27:20.079 183284 DEBUG nova.scheduler.client.report [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Inventory has not changed for provider 502e4243-611b-433d-a766-9b485d51652d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 18:27:20 compute-0 nova_compute[183278]: 2026-01-21 18:27:20.081 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 18:27:20 compute-0 nova_compute[183278]: 2026-01-21 18:27:20.081 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.097s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:27:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:27:20.084 104698 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:27:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:27:20.084 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:27:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:27:20.084 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:27:20 compute-0 nova_compute[183278]: 2026-01-21 18:27:20.352 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:27:20 compute-0 nova_compute[183278]: 2026-01-21 18:27:20.872 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:27:22 compute-0 nova_compute[183278]: 2026-01-21 18:27:22.077 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:27:22 compute-0 nova_compute[183278]: 2026-01-21 18:27:22.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:27:23 compute-0 nova_compute[183278]: 2026-01-21 18:27:23.193 183284 DEBUG oslo_concurrency.lockutils [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Acquiring lock "4f3aff6b-5152-4498-a9fc-faba398385b8" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:27:23 compute-0 nova_compute[183278]: 2026-01-21 18:27:23.193 183284 DEBUG oslo_concurrency.lockutils [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "4f3aff6b-5152-4498-a9fc-faba398385b8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:27:23 compute-0 nova_compute[183278]: 2026-01-21 18:27:23.213 183284 DEBUG nova.compute.manager [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 21 18:27:23 compute-0 nova_compute[183278]: 2026-01-21 18:27:23.280 183284 DEBUG oslo_concurrency.lockutils [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:27:23 compute-0 nova_compute[183278]: 2026-01-21 18:27:23.281 183284 DEBUG oslo_concurrency.lockutils [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:27:23 compute-0 nova_compute[183278]: 2026-01-21 18:27:23.291 183284 DEBUG nova.virt.hardware [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 18:27:23 compute-0 nova_compute[183278]: 2026-01-21 18:27:23.291 183284 INFO nova.compute.claims [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Claim successful on node compute-0.ctlplane.example.com
Jan 21 18:27:23 compute-0 nova_compute[183278]: 2026-01-21 18:27:23.396 183284 DEBUG nova.compute.provider_tree [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Inventory has not changed in ProviderTree for provider: 502e4243-611b-433d-a766-9b485d51652d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 18:27:23 compute-0 nova_compute[183278]: 2026-01-21 18:27:23.413 183284 DEBUG nova.scheduler.client.report [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Inventory has not changed for provider 502e4243-611b-433d-a766-9b485d51652d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 18:27:23 compute-0 nova_compute[183278]: 2026-01-21 18:27:23.648 183284 DEBUG oslo_concurrency.lockutils [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.367s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:27:23 compute-0 nova_compute[183278]: 2026-01-21 18:27:23.649 183284 DEBUG nova.compute.manager [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 21 18:27:23 compute-0 nova_compute[183278]: 2026-01-21 18:27:23.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:27:23 compute-0 nova_compute[183278]: 2026-01-21 18:27:23.918 183284 DEBUG nova.compute.manager [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 21 18:27:23 compute-0 nova_compute[183278]: 2026-01-21 18:27:23.918 183284 DEBUG nova.network.neutron [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 21 18:27:24 compute-0 nova_compute[183278]: 2026-01-21 18:27:24.103 183284 INFO nova.virt.libvirt.driver [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 21 18:27:24 compute-0 nova_compute[183278]: 2026-01-21 18:27:24.127 183284 DEBUG nova.compute.manager [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 21 18:27:24 compute-0 nova_compute[183278]: 2026-01-21 18:27:24.275 183284 DEBUG nova.compute.manager [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 21 18:27:24 compute-0 nova_compute[183278]: 2026-01-21 18:27:24.276 183284 DEBUG nova.virt.libvirt.driver [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 18:27:24 compute-0 nova_compute[183278]: 2026-01-21 18:27:24.276 183284 INFO nova.virt.libvirt.driver [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Creating image(s)
Jan 21 18:27:24 compute-0 nova_compute[183278]: 2026-01-21 18:27:24.277 183284 DEBUG oslo_concurrency.lockutils [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Acquiring lock "/var/lib/nova/instances/4f3aff6b-5152-4498-a9fc-faba398385b8/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:27:24 compute-0 nova_compute[183278]: 2026-01-21 18:27:24.277 183284 DEBUG oslo_concurrency.lockutils [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "/var/lib/nova/instances/4f3aff6b-5152-4498-a9fc-faba398385b8/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:27:24 compute-0 nova_compute[183278]: 2026-01-21 18:27:24.278 183284 DEBUG oslo_concurrency.lockutils [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "/var/lib/nova/instances/4f3aff6b-5152-4498-a9fc-faba398385b8/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:27:24 compute-0 nova_compute[183278]: 2026-01-21 18:27:24.291 183284 DEBUG oslo_concurrency.processutils [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:27:24 compute-0 nova_compute[183278]: 2026-01-21 18:27:24.345 183284 DEBUG oslo_concurrency.processutils [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:27:24 compute-0 nova_compute[183278]: 2026-01-21 18:27:24.346 183284 DEBUG oslo_concurrency.lockutils [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Acquiring lock "8bf45782a806e8f2f684ae874be9ab99d891a685" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:27:24 compute-0 nova_compute[183278]: 2026-01-21 18:27:24.347 183284 DEBUG oslo_concurrency.lockutils [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "8bf45782a806e8f2f684ae874be9ab99d891a685" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:27:24 compute-0 nova_compute[183278]: 2026-01-21 18:27:24.357 183284 DEBUG oslo_concurrency.processutils [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:27:24 compute-0 nova_compute[183278]: 2026-01-21 18:27:24.411 183284 DEBUG oslo_concurrency.processutils [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:27:24 compute-0 nova_compute[183278]: 2026-01-21 18:27:24.411 183284 DEBUG oslo_concurrency.processutils [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685,backing_fmt=raw /var/lib/nova/instances/4f3aff6b-5152-4498-a9fc-faba398385b8/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:27:24 compute-0 nova_compute[183278]: 2026-01-21 18:27:24.440 183284 DEBUG oslo_concurrency.processutils [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685,backing_fmt=raw /var/lib/nova/instances/4f3aff6b-5152-4498-a9fc-faba398385b8/disk 1073741824" returned: 0 in 0.028s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:27:24 compute-0 nova_compute[183278]: 2026-01-21 18:27:24.441 183284 DEBUG oslo_concurrency.lockutils [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "8bf45782a806e8f2f684ae874be9ab99d891a685" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.094s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:27:24 compute-0 nova_compute[183278]: 2026-01-21 18:27:24.441 183284 DEBUG oslo_concurrency.processutils [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:27:24 compute-0 nova_compute[183278]: 2026-01-21 18:27:24.495 183284 DEBUG oslo_concurrency.processutils [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:27:24 compute-0 nova_compute[183278]: 2026-01-21 18:27:24.496 183284 DEBUG nova.virt.disk.api [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Checking if we can resize image /var/lib/nova/instances/4f3aff6b-5152-4498-a9fc-faba398385b8/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 18:27:24 compute-0 nova_compute[183278]: 2026-01-21 18:27:24.496 183284 DEBUG oslo_concurrency.processutils [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4f3aff6b-5152-4498-a9fc-faba398385b8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:27:24 compute-0 nova_compute[183278]: 2026-01-21 18:27:24.549 183284 DEBUG oslo_concurrency.processutils [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4f3aff6b-5152-4498-a9fc-faba398385b8/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:27:24 compute-0 nova_compute[183278]: 2026-01-21 18:27:24.550 183284 DEBUG nova.virt.disk.api [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Cannot resize image /var/lib/nova/instances/4f3aff6b-5152-4498-a9fc-faba398385b8/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 18:27:24 compute-0 nova_compute[183278]: 2026-01-21 18:27:24.550 183284 DEBUG nova.objects.instance [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lazy-loading 'migration_context' on Instance uuid 4f3aff6b-5152-4498-a9fc-faba398385b8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:27:24 compute-0 nova_compute[183278]: 2026-01-21 18:27:24.594 183284 DEBUG nova.virt.libvirt.driver [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 18:27:24 compute-0 nova_compute[183278]: 2026-01-21 18:27:24.594 183284 DEBUG nova.virt.libvirt.driver [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Ensure instance console log exists: /var/lib/nova/instances/4f3aff6b-5152-4498-a9fc-faba398385b8/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 18:27:24 compute-0 nova_compute[183278]: 2026-01-21 18:27:24.595 183284 DEBUG oslo_concurrency.lockutils [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:27:24 compute-0 nova_compute[183278]: 2026-01-21 18:27:24.595 183284 DEBUG oslo_concurrency.lockutils [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:27:24 compute-0 nova_compute[183278]: 2026-01-21 18:27:24.596 183284 DEBUG oslo_concurrency.lockutils [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:27:24 compute-0 nova_compute[183278]: 2026-01-21 18:27:24.812 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:27:24 compute-0 nova_compute[183278]: 2026-01-21 18:27:24.864 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:27:24 compute-0 nova_compute[183278]: 2026-01-21 18:27:24.864 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 18:27:24 compute-0 nova_compute[183278]: 2026-01-21 18:27:24.983 183284 DEBUG nova.policy [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '41dc6e790bc54fbfaf5c6007d3fa5f63', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'fe688847145f4dee992c72dd40bbc1ac', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 21 18:27:25 compute-0 nova_compute[183278]: 2026-01-21 18:27:25.411 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:27:25 compute-0 nova_compute[183278]: 2026-01-21 18:27:25.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:27:25 compute-0 nova_compute[183278]: 2026-01-21 18:27:25.875 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:27:26 compute-0 nova_compute[183278]: 2026-01-21 18:27:26.961 183284 DEBUG nova.network.neutron [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Successfully created port: 79571f69-76ad-462f-8176-f34f6dab8ddb _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 21 18:27:27 compute-0 nova_compute[183278]: 2026-01-21 18:27:27.684 183284 DEBUG nova.network.neutron [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Successfully updated port: 79571f69-76ad-462f-8176-f34f6dab8ddb _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 21 18:27:27 compute-0 nova_compute[183278]: 2026-01-21 18:27:27.698 183284 DEBUG oslo_concurrency.lockutils [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Acquiring lock "refresh_cache-4f3aff6b-5152-4498-a9fc-faba398385b8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:27:27 compute-0 nova_compute[183278]: 2026-01-21 18:27:27.699 183284 DEBUG oslo_concurrency.lockutils [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Acquired lock "refresh_cache-4f3aff6b-5152-4498-a9fc-faba398385b8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 18:27:27 compute-0 nova_compute[183278]: 2026-01-21 18:27:27.699 183284 DEBUG nova.network.neutron [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 18:27:27 compute-0 nova_compute[183278]: 2026-01-21 18:27:27.773 183284 DEBUG nova.compute.manager [req-ab9cae96-6bf1-4eed-aee6-67dce1cce6f4 req-1d1bc29e-8f15-4116-9d6d-e8325db949ac 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Received event network-changed-79571f69-76ad-462f-8176-f34f6dab8ddb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:27:27 compute-0 nova_compute[183278]: 2026-01-21 18:27:27.773 183284 DEBUG nova.compute.manager [req-ab9cae96-6bf1-4eed-aee6-67dce1cce6f4 req-1d1bc29e-8f15-4116-9d6d-e8325db949ac 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Refreshing instance network info cache due to event network-changed-79571f69-76ad-462f-8176-f34f6dab8ddb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 18:27:27 compute-0 nova_compute[183278]: 2026-01-21 18:27:27.773 183284 DEBUG oslo_concurrency.lockutils [req-ab9cae96-6bf1-4eed-aee6-67dce1cce6f4 req-1d1bc29e-8f15-4116-9d6d-e8325db949ac 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "refresh_cache-4f3aff6b-5152-4498-a9fc-faba398385b8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:27:27 compute-0 nova_compute[183278]: 2026-01-21 18:27:27.841 183284 DEBUG nova.network.neutron [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 18:27:27 compute-0 podman[208591]: 2026-01-21 18:27:27.998672564 +0000 UTC m=+0.055079974 container health_status 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, version=9.6, config_id=openstack_network_exporter, managed_by=edpm_ansible, architecture=x86_64, io.buildah.version=1.33.7, maintainer=Red Hat, Inc.)
Jan 21 18:27:29 compute-0 nova_compute[183278]: 2026-01-21 18:27:29.074 183284 DEBUG nova.network.neutron [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Updating instance_info_cache with network_info: [{"id": "79571f69-76ad-462f-8176-f34f6dab8ddb", "address": "fa:16:3e:5d:b5:96", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap79571f69-76", "ovs_interfaceid": "79571f69-76ad-462f-8176-f34f6dab8ddb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:27:29 compute-0 nova_compute[183278]: 2026-01-21 18:27:29.119 183284 DEBUG oslo_concurrency.lockutils [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Releasing lock "refresh_cache-4f3aff6b-5152-4498-a9fc-faba398385b8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 18:27:29 compute-0 nova_compute[183278]: 2026-01-21 18:27:29.120 183284 DEBUG nova.compute.manager [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Instance network_info: |[{"id": "79571f69-76ad-462f-8176-f34f6dab8ddb", "address": "fa:16:3e:5d:b5:96", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap79571f69-76", "ovs_interfaceid": "79571f69-76ad-462f-8176-f34f6dab8ddb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 21 18:27:29 compute-0 nova_compute[183278]: 2026-01-21 18:27:29.120 183284 DEBUG oslo_concurrency.lockutils [req-ab9cae96-6bf1-4eed-aee6-67dce1cce6f4 req-1d1bc29e-8f15-4116-9d6d-e8325db949ac 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquired lock "refresh_cache-4f3aff6b-5152-4498-a9fc-faba398385b8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 18:27:29 compute-0 nova_compute[183278]: 2026-01-21 18:27:29.121 183284 DEBUG nova.network.neutron [req-ab9cae96-6bf1-4eed-aee6-67dce1cce6f4 req-1d1bc29e-8f15-4116-9d6d-e8325db949ac 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Refreshing network info cache for port 79571f69-76ad-462f-8176-f34f6dab8ddb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 18:27:29 compute-0 nova_compute[183278]: 2026-01-21 18:27:29.123 183284 DEBUG nova.virt.libvirt.driver [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Start _get_guest_xml network_info=[{"id": "79571f69-76ad-462f-8176-f34f6dab8ddb", "address": "fa:16:3e:5d:b5:96", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap79571f69-76", "ovs_interfaceid": "79571f69-76ad-462f-8176-f34f6dab8ddb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T18:09:31Z,direct_url=<?>,disk_format='qcow2',id=672306ae-5521-4fc1-a825-a16d6d125c61,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='05bd8d3790e24414a90ecf55ee1939e1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T18:09:33Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_type': 'disk', 'encryption_options': None, 'boot_index': 0, 'encryption_secret_uuid': None, 'encryption_format': None, 'image_id': '672306ae-5521-4fc1-a825-a16d6d125c61'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 18:27:29 compute-0 nova_compute[183278]: 2026-01-21 18:27:29.128 183284 WARNING nova.virt.libvirt.driver [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 18:27:29 compute-0 nova_compute[183278]: 2026-01-21 18:27:29.133 183284 DEBUG nova.virt.libvirt.host [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 18:27:29 compute-0 nova_compute[183278]: 2026-01-21 18:27:29.133 183284 DEBUG nova.virt.libvirt.host [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 18:27:29 compute-0 nova_compute[183278]: 2026-01-21 18:27:29.139 183284 DEBUG nova.virt.libvirt.host [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 18:27:29 compute-0 nova_compute[183278]: 2026-01-21 18:27:29.140 183284 DEBUG nova.virt.libvirt.host [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 18:27:29 compute-0 nova_compute[183278]: 2026-01-21 18:27:29.141 183284 DEBUG nova.virt.libvirt.driver [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 18:27:29 compute-0 nova_compute[183278]: 2026-01-21 18:27:29.141 183284 DEBUG nova.virt.hardware [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T18:09:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='45095fe9-3fd5-4f1f-87b2-a2a8292135a2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T18:09:31Z,direct_url=<?>,disk_format='qcow2',id=672306ae-5521-4fc1-a825-a16d6d125c61,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='05bd8d3790e24414a90ecf55ee1939e1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T18:09:33Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 18:27:29 compute-0 nova_compute[183278]: 2026-01-21 18:27:29.141 183284 DEBUG nova.virt.hardware [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 18:27:29 compute-0 nova_compute[183278]: 2026-01-21 18:27:29.141 183284 DEBUG nova.virt.hardware [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 18:27:29 compute-0 nova_compute[183278]: 2026-01-21 18:27:29.142 183284 DEBUG nova.virt.hardware [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 18:27:29 compute-0 nova_compute[183278]: 2026-01-21 18:27:29.142 183284 DEBUG nova.virt.hardware [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 18:27:29 compute-0 nova_compute[183278]: 2026-01-21 18:27:29.142 183284 DEBUG nova.virt.hardware [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 18:27:29 compute-0 nova_compute[183278]: 2026-01-21 18:27:29.142 183284 DEBUG nova.virt.hardware [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 18:27:29 compute-0 nova_compute[183278]: 2026-01-21 18:27:29.142 183284 DEBUG nova.virt.hardware [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 18:27:29 compute-0 nova_compute[183278]: 2026-01-21 18:27:29.143 183284 DEBUG nova.virt.hardware [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 18:27:29 compute-0 nova_compute[183278]: 2026-01-21 18:27:29.143 183284 DEBUG nova.virt.hardware [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 18:27:29 compute-0 nova_compute[183278]: 2026-01-21 18:27:29.143 183284 DEBUG nova.virt.hardware [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 18:27:29 compute-0 nova_compute[183278]: 2026-01-21 18:27:29.146 183284 DEBUG nova.virt.libvirt.vif [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T18:27:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1799071958',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1799071958',id=15,image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fe688847145f4dee992c72dd40bbc1ac',ramdisk_id='',reservation_id='r-2pe732ao',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1753607426',owner_user_name='tempest-TestExecuteStrategies-1753607426-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T18:27:24Z,user_data=None,user_id='41dc6e790bc54fbfaf5c6007d3fa5f63',uuid=4f3aff6b-5152-4498-a9fc-faba398385b8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "79571f69-76ad-462f-8176-f34f6dab8ddb", "address": "fa:16:3e:5d:b5:96", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap79571f69-76", "ovs_interfaceid": "79571f69-76ad-462f-8176-f34f6dab8ddb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 18:27:29 compute-0 nova_compute[183278]: 2026-01-21 18:27:29.146 183284 DEBUG nova.network.os_vif_util [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Converting VIF {"id": "79571f69-76ad-462f-8176-f34f6dab8ddb", "address": "fa:16:3e:5d:b5:96", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap79571f69-76", "ovs_interfaceid": "79571f69-76ad-462f-8176-f34f6dab8ddb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 18:27:29 compute-0 nova_compute[183278]: 2026-01-21 18:27:29.147 183284 DEBUG nova.network.os_vif_util [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5d:b5:96,bridge_name='br-int',has_traffic_filtering=True,id=79571f69-76ad-462f-8176-f34f6dab8ddb,network=Network(405ec01b-76d3-4c3c-a31b-5f16d9641fbf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap79571f69-76') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 18:27:29 compute-0 nova_compute[183278]: 2026-01-21 18:27:29.148 183284 DEBUG nova.objects.instance [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lazy-loading 'pci_devices' on Instance uuid 4f3aff6b-5152-4498-a9fc-faba398385b8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:27:29 compute-0 nova_compute[183278]: 2026-01-21 18:27:29.162 183284 DEBUG nova.virt.libvirt.driver [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] End _get_guest_xml xml=<domain type="kvm">
Jan 21 18:27:29 compute-0 nova_compute[183278]:   <uuid>4f3aff6b-5152-4498-a9fc-faba398385b8</uuid>
Jan 21 18:27:29 compute-0 nova_compute[183278]:   <name>instance-0000000f</name>
Jan 21 18:27:29 compute-0 nova_compute[183278]:   <memory>131072</memory>
Jan 21 18:27:29 compute-0 nova_compute[183278]:   <vcpu>1</vcpu>
Jan 21 18:27:29 compute-0 nova_compute[183278]:   <metadata>
Jan 21 18:27:29 compute-0 nova_compute[183278]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 18:27:29 compute-0 nova_compute[183278]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 18:27:29 compute-0 nova_compute[183278]:       <nova:name>tempest-TestExecuteStrategies-server-1799071958</nova:name>
Jan 21 18:27:29 compute-0 nova_compute[183278]:       <nova:creationTime>2026-01-21 18:27:29</nova:creationTime>
Jan 21 18:27:29 compute-0 nova_compute[183278]:       <nova:flavor name="m1.nano">
Jan 21 18:27:29 compute-0 nova_compute[183278]:         <nova:memory>128</nova:memory>
Jan 21 18:27:29 compute-0 nova_compute[183278]:         <nova:disk>1</nova:disk>
Jan 21 18:27:29 compute-0 nova_compute[183278]:         <nova:swap>0</nova:swap>
Jan 21 18:27:29 compute-0 nova_compute[183278]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 18:27:29 compute-0 nova_compute[183278]:         <nova:vcpus>1</nova:vcpus>
Jan 21 18:27:29 compute-0 nova_compute[183278]:       </nova:flavor>
Jan 21 18:27:29 compute-0 nova_compute[183278]:       <nova:owner>
Jan 21 18:27:29 compute-0 nova_compute[183278]:         <nova:user uuid="41dc6e790bc54fbfaf5c6007d3fa5f63">tempest-TestExecuteStrategies-1753607426-project-member</nova:user>
Jan 21 18:27:29 compute-0 nova_compute[183278]:         <nova:project uuid="fe688847145f4dee992c72dd40bbc1ac">tempest-TestExecuteStrategies-1753607426</nova:project>
Jan 21 18:27:29 compute-0 nova_compute[183278]:       </nova:owner>
Jan 21 18:27:29 compute-0 nova_compute[183278]:       <nova:root type="image" uuid="672306ae-5521-4fc1-a825-a16d6d125c61"/>
Jan 21 18:27:29 compute-0 nova_compute[183278]:       <nova:ports>
Jan 21 18:27:29 compute-0 nova_compute[183278]:         <nova:port uuid="79571f69-76ad-462f-8176-f34f6dab8ddb">
Jan 21 18:27:29 compute-0 nova_compute[183278]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 21 18:27:29 compute-0 nova_compute[183278]:         </nova:port>
Jan 21 18:27:29 compute-0 nova_compute[183278]:       </nova:ports>
Jan 21 18:27:29 compute-0 nova_compute[183278]:     </nova:instance>
Jan 21 18:27:29 compute-0 nova_compute[183278]:   </metadata>
Jan 21 18:27:29 compute-0 nova_compute[183278]:   <sysinfo type="smbios">
Jan 21 18:27:29 compute-0 nova_compute[183278]:     <system>
Jan 21 18:27:29 compute-0 nova_compute[183278]:       <entry name="manufacturer">RDO</entry>
Jan 21 18:27:29 compute-0 nova_compute[183278]:       <entry name="product">OpenStack Compute</entry>
Jan 21 18:27:29 compute-0 nova_compute[183278]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 18:27:29 compute-0 nova_compute[183278]:       <entry name="serial">4f3aff6b-5152-4498-a9fc-faba398385b8</entry>
Jan 21 18:27:29 compute-0 nova_compute[183278]:       <entry name="uuid">4f3aff6b-5152-4498-a9fc-faba398385b8</entry>
Jan 21 18:27:29 compute-0 nova_compute[183278]:       <entry name="family">Virtual Machine</entry>
Jan 21 18:27:29 compute-0 nova_compute[183278]:     </system>
Jan 21 18:27:29 compute-0 nova_compute[183278]:   </sysinfo>
Jan 21 18:27:29 compute-0 nova_compute[183278]:   <os>
Jan 21 18:27:29 compute-0 nova_compute[183278]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 18:27:29 compute-0 nova_compute[183278]:     <boot dev="hd"/>
Jan 21 18:27:29 compute-0 nova_compute[183278]:     <smbios mode="sysinfo"/>
Jan 21 18:27:29 compute-0 nova_compute[183278]:   </os>
Jan 21 18:27:29 compute-0 nova_compute[183278]:   <features>
Jan 21 18:27:29 compute-0 nova_compute[183278]:     <acpi/>
Jan 21 18:27:29 compute-0 nova_compute[183278]:     <apic/>
Jan 21 18:27:29 compute-0 nova_compute[183278]:     <vmcoreinfo/>
Jan 21 18:27:29 compute-0 nova_compute[183278]:   </features>
Jan 21 18:27:29 compute-0 nova_compute[183278]:   <clock offset="utc">
Jan 21 18:27:29 compute-0 nova_compute[183278]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 18:27:29 compute-0 nova_compute[183278]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 18:27:29 compute-0 nova_compute[183278]:     <timer name="hpet" present="no"/>
Jan 21 18:27:29 compute-0 nova_compute[183278]:   </clock>
Jan 21 18:27:29 compute-0 nova_compute[183278]:   <cpu mode="custom" match="exact">
Jan 21 18:27:29 compute-0 nova_compute[183278]:     <model>Nehalem</model>
Jan 21 18:27:29 compute-0 nova_compute[183278]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 18:27:29 compute-0 nova_compute[183278]:   </cpu>
Jan 21 18:27:29 compute-0 nova_compute[183278]:   <devices>
Jan 21 18:27:29 compute-0 nova_compute[183278]:     <disk type="file" device="disk">
Jan 21 18:27:29 compute-0 nova_compute[183278]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 18:27:29 compute-0 nova_compute[183278]:       <source file="/var/lib/nova/instances/4f3aff6b-5152-4498-a9fc-faba398385b8/disk"/>
Jan 21 18:27:29 compute-0 nova_compute[183278]:       <target dev="vda" bus="virtio"/>
Jan 21 18:27:29 compute-0 nova_compute[183278]:     </disk>
Jan 21 18:27:29 compute-0 nova_compute[183278]:     <disk type="file" device="cdrom">
Jan 21 18:27:29 compute-0 nova_compute[183278]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 18:27:29 compute-0 nova_compute[183278]:       <source file="/var/lib/nova/instances/4f3aff6b-5152-4498-a9fc-faba398385b8/disk.config"/>
Jan 21 18:27:29 compute-0 nova_compute[183278]:       <target dev="sda" bus="sata"/>
Jan 21 18:27:29 compute-0 nova_compute[183278]:     </disk>
Jan 21 18:27:29 compute-0 nova_compute[183278]:     <interface type="ethernet">
Jan 21 18:27:29 compute-0 nova_compute[183278]:       <mac address="fa:16:3e:5d:b5:96"/>
Jan 21 18:27:29 compute-0 nova_compute[183278]:       <model type="virtio"/>
Jan 21 18:27:29 compute-0 nova_compute[183278]:       <driver name="vhost" rx_queue_size="512"/>
Jan 21 18:27:29 compute-0 nova_compute[183278]:       <mtu size="1442"/>
Jan 21 18:27:29 compute-0 nova_compute[183278]:       <target dev="tap79571f69-76"/>
Jan 21 18:27:29 compute-0 nova_compute[183278]:     </interface>
Jan 21 18:27:29 compute-0 nova_compute[183278]:     <serial type="pty">
Jan 21 18:27:29 compute-0 nova_compute[183278]:       <log file="/var/lib/nova/instances/4f3aff6b-5152-4498-a9fc-faba398385b8/console.log" append="off"/>
Jan 21 18:27:29 compute-0 nova_compute[183278]:     </serial>
Jan 21 18:27:29 compute-0 nova_compute[183278]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 18:27:29 compute-0 nova_compute[183278]:     <video>
Jan 21 18:27:29 compute-0 nova_compute[183278]:       <model type="virtio"/>
Jan 21 18:27:29 compute-0 nova_compute[183278]:     </video>
Jan 21 18:27:29 compute-0 nova_compute[183278]:     <input type="tablet" bus="usb"/>
Jan 21 18:27:29 compute-0 nova_compute[183278]:     <rng model="virtio">
Jan 21 18:27:29 compute-0 nova_compute[183278]:       <backend model="random">/dev/urandom</backend>
Jan 21 18:27:29 compute-0 nova_compute[183278]:     </rng>
Jan 21 18:27:29 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root"/>
Jan 21 18:27:29 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:27:29 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:27:29 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:27:29 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:27:29 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:27:29 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:27:29 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:27:29 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:27:29 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:27:29 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:27:29 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:27:29 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:27:29 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:27:29 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:27:29 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:27:29 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:27:29 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:27:29 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:27:29 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:27:29 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:27:29 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:27:29 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:27:29 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:27:29 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:27:29 compute-0 nova_compute[183278]:     <controller type="usb" index="0"/>
Jan 21 18:27:29 compute-0 nova_compute[183278]:     <memballoon model="virtio">
Jan 21 18:27:29 compute-0 nova_compute[183278]:       <stats period="10"/>
Jan 21 18:27:29 compute-0 nova_compute[183278]:     </memballoon>
Jan 21 18:27:29 compute-0 nova_compute[183278]:   </devices>
Jan 21 18:27:29 compute-0 nova_compute[183278]: </domain>
Jan 21 18:27:29 compute-0 nova_compute[183278]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 18:27:29 compute-0 nova_compute[183278]: 2026-01-21 18:27:29.164 183284 DEBUG nova.compute.manager [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Preparing to wait for external event network-vif-plugged-79571f69-76ad-462f-8176-f34f6dab8ddb prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 21 18:27:29 compute-0 nova_compute[183278]: 2026-01-21 18:27:29.164 183284 DEBUG oslo_concurrency.lockutils [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Acquiring lock "4f3aff6b-5152-4498-a9fc-faba398385b8-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:27:29 compute-0 nova_compute[183278]: 2026-01-21 18:27:29.164 183284 DEBUG oslo_concurrency.lockutils [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "4f3aff6b-5152-4498-a9fc-faba398385b8-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:27:29 compute-0 nova_compute[183278]: 2026-01-21 18:27:29.165 183284 DEBUG oslo_concurrency.lockutils [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "4f3aff6b-5152-4498-a9fc-faba398385b8-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:27:29 compute-0 nova_compute[183278]: 2026-01-21 18:27:29.165 183284 DEBUG nova.virt.libvirt.vif [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T18:27:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1799071958',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1799071958',id=15,image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fe688847145f4dee992c72dd40bbc1ac',ramdisk_id='',reservation_id='r-2pe732ao',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1753607426',owner_user_name='tempest-TestExecuteStrategies-1753607426-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T18:27:24Z,user_data=None,user_id='41dc6e790bc54fbfaf5c6007d3fa5f63',uuid=4f3aff6b-5152-4498-a9fc-faba398385b8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "79571f69-76ad-462f-8176-f34f6dab8ddb", "address": "fa:16:3e:5d:b5:96", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap79571f69-76", "ovs_interfaceid": "79571f69-76ad-462f-8176-f34f6dab8ddb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 21 18:27:29 compute-0 nova_compute[183278]: 2026-01-21 18:27:29.165 183284 DEBUG nova.network.os_vif_util [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Converting VIF {"id": "79571f69-76ad-462f-8176-f34f6dab8ddb", "address": "fa:16:3e:5d:b5:96", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap79571f69-76", "ovs_interfaceid": "79571f69-76ad-462f-8176-f34f6dab8ddb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 18:27:29 compute-0 nova_compute[183278]: 2026-01-21 18:27:29.166 183284 DEBUG nova.network.os_vif_util [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5d:b5:96,bridge_name='br-int',has_traffic_filtering=True,id=79571f69-76ad-462f-8176-f34f6dab8ddb,network=Network(405ec01b-76d3-4c3c-a31b-5f16d9641fbf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap79571f69-76') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 18:27:29 compute-0 nova_compute[183278]: 2026-01-21 18:27:29.166 183284 DEBUG os_vif [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5d:b5:96,bridge_name='br-int',has_traffic_filtering=True,id=79571f69-76ad-462f-8176-f34f6dab8ddb,network=Network(405ec01b-76d3-4c3c-a31b-5f16d9641fbf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap79571f69-76') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 21 18:27:29 compute-0 nova_compute[183278]: 2026-01-21 18:27:29.167 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:27:29 compute-0 nova_compute[183278]: 2026-01-21 18:27:29.167 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:27:29 compute-0 nova_compute[183278]: 2026-01-21 18:27:29.167 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 18:27:29 compute-0 nova_compute[183278]: 2026-01-21 18:27:29.170 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:27:29 compute-0 nova_compute[183278]: 2026-01-21 18:27:29.170 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap79571f69-76, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:27:29 compute-0 nova_compute[183278]: 2026-01-21 18:27:29.170 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap79571f69-76, col_values=(('external_ids', {'iface-id': '79571f69-76ad-462f-8176-f34f6dab8ddb', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5d:b5:96', 'vm-uuid': '4f3aff6b-5152-4498-a9fc-faba398385b8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:27:29 compute-0 nova_compute[183278]: 2026-01-21 18:27:29.172 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:27:29 compute-0 NetworkManager[55506]: <info>  [1769020049.1729] manager: (tap79571f69-76): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/54)
Jan 21 18:27:29 compute-0 nova_compute[183278]: 2026-01-21 18:27:29.174 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 18:27:29 compute-0 nova_compute[183278]: 2026-01-21 18:27:29.178 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:27:29 compute-0 nova_compute[183278]: 2026-01-21 18:27:29.178 183284 INFO os_vif [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5d:b5:96,bridge_name='br-int',has_traffic_filtering=True,id=79571f69-76ad-462f-8176-f34f6dab8ddb,network=Network(405ec01b-76d3-4c3c-a31b-5f16d9641fbf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap79571f69-76')
Jan 21 18:27:29 compute-0 nova_compute[183278]: 2026-01-21 18:27:29.238 183284 DEBUG nova.virt.libvirt.driver [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 18:27:29 compute-0 nova_compute[183278]: 2026-01-21 18:27:29.238 183284 DEBUG nova.virt.libvirt.driver [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 18:27:29 compute-0 nova_compute[183278]: 2026-01-21 18:27:29.238 183284 DEBUG nova.virt.libvirt.driver [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] No VIF found with MAC fa:16:3e:5d:b5:96, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 21 18:27:29 compute-0 nova_compute[183278]: 2026-01-21 18:27:29.239 183284 INFO nova.virt.libvirt.driver [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Using config drive
Jan 21 18:27:29 compute-0 podman[192560]: time="2026-01-21T18:27:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 18:27:29 compute-0 podman[192560]: @ - - [21/Jan/2026:18:27:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15354 "" "Go-http-client/1.1"
Jan 21 18:27:29 compute-0 podman[192560]: @ - - [21/Jan/2026:18:27:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2174 "" "Go-http-client/1.1"
Jan 21 18:27:30 compute-0 nova_compute[183278]: 2026-01-21 18:27:30.116 183284 INFO nova.virt.libvirt.driver [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Creating config drive at /var/lib/nova/instances/4f3aff6b-5152-4498-a9fc-faba398385b8/disk.config
Jan 21 18:27:30 compute-0 nova_compute[183278]: 2026-01-21 18:27:30.121 183284 DEBUG oslo_concurrency.processutils [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4f3aff6b-5152-4498-a9fc-faba398385b8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4cattfjt execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:27:30 compute-0 nova_compute[183278]: 2026-01-21 18:27:30.244 183284 DEBUG oslo_concurrency.processutils [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4f3aff6b-5152-4498-a9fc-faba398385b8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4cattfjt" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:27:30 compute-0 kernel: tap79571f69-76: entered promiscuous mode
Jan 21 18:27:30 compute-0 NetworkManager[55506]: <info>  [1769020050.3034] manager: (tap79571f69-76): new Tun device (/org/freedesktop/NetworkManager/Devices/55)
Jan 21 18:27:30 compute-0 ovn_controller[95419]: 2026-01-21T18:27:30Z|00116|binding|INFO|Claiming lport 79571f69-76ad-462f-8176-f34f6dab8ddb for this chassis.
Jan 21 18:27:30 compute-0 nova_compute[183278]: 2026-01-21 18:27:30.303 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:27:30 compute-0 ovn_controller[95419]: 2026-01-21T18:27:30Z|00117|binding|INFO|79571f69-76ad-462f-8176-f34f6dab8ddb: Claiming fa:16:3e:5d:b5:96 10.100.0.13
Jan 21 18:27:30 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:27:30.313 104698 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5d:b5:96 10.100.0.13'], port_security=['fa:16:3e:5d:b5:96 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '4f3aff6b-5152-4498-a9fc-faba398385b8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-405ec01b-76d3-4c3c-a31b-5f16d9641fbf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe688847145f4dee992c72dd40bbc1ac', 'neutron:revision_number': '2', 'neutron:security_group_ids': '772bc98e-4f63-476c-ac15-1e185ee339f7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=64b67886-c0d9-40d2-a2d0-cf96a9cd3c14, chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>], logical_port=79571f69-76ad-462f-8176-f34f6dab8ddb) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 18:27:30 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:27:30.314 104698 INFO neutron.agent.ovn.metadata.agent [-] Port 79571f69-76ad-462f-8176-f34f6dab8ddb in datapath 405ec01b-76d3-4c3c-a31b-5f16d9641fbf bound to our chassis
Jan 21 18:27:30 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:27:30.315 104698 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 405ec01b-76d3-4c3c-a31b-5f16d9641fbf
Jan 21 18:27:30 compute-0 ovn_controller[95419]: 2026-01-21T18:27:30Z|00118|binding|INFO|Setting lport 79571f69-76ad-462f-8176-f34f6dab8ddb ovn-installed in OVS
Jan 21 18:27:30 compute-0 ovn_controller[95419]: 2026-01-21T18:27:30Z|00119|binding|INFO|Setting lport 79571f69-76ad-462f-8176-f34f6dab8ddb up in Southbound
Jan 21 18:27:30 compute-0 nova_compute[183278]: 2026-01-21 18:27:30.317 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:27:30 compute-0 nova_compute[183278]: 2026-01-21 18:27:30.318 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:27:30 compute-0 nova_compute[183278]: 2026-01-21 18:27:30.321 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:27:30 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:27:30.327 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[c2b16267-6d65-4f46-ac5a-fa29a3443354]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:27:30 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:27:30.328 104698 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap405ec01b-71 in ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 21 18:27:30 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:27:30.330 203892 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap405ec01b-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 21 18:27:30 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:27:30.330 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[54dc18eb-bc27-4d62-9c63-0a3c2d86e710]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:27:30 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:27:30.331 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[f759d5cd-9fc3-4de4-bc43-79467ebec2ac]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:27:30 compute-0 systemd-udevd[208635]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 18:27:30 compute-0 systemd-machined[154592]: New machine qemu-11-instance-0000000f.
Jan 21 18:27:30 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:27:30.341 105036 DEBUG oslo.privsep.daemon [-] privsep: reply[7a9818c8-8200-43f0-b056-5dd096caf2a4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:27:30 compute-0 NetworkManager[55506]: <info>  [1769020050.3506] device (tap79571f69-76): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 18:27:30 compute-0 NetworkManager[55506]: <info>  [1769020050.3516] device (tap79571f69-76): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 18:27:30 compute-0 systemd[1]: Started Virtual Machine qemu-11-instance-0000000f.
Jan 21 18:27:30 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:27:30.363 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[cb54debd-7987-4d89-893b-56d60fedfb29]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:27:30 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:27:30.391 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[01fa0d16-2586-4f22-987a-d3b027fb98db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:27:30 compute-0 NetworkManager[55506]: <info>  [1769020050.3975] manager: (tap405ec01b-70): new Veth device (/org/freedesktop/NetworkManager/Devices/56)
Jan 21 18:27:30 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:27:30.396 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[15f713ba-a35b-4488-9ece-53abed3af129]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:27:30 compute-0 nova_compute[183278]: 2026-01-21 18:27:30.412 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:27:30 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:27:30.429 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[2e9f9140-7a95-45f2-9b2f-64cadc3a2c5f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:27:30 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:27:30.431 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[524ea6ac-5a7a-4f65-a652-3b7f3d1c9ac7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:27:30 compute-0 NetworkManager[55506]: <info>  [1769020050.4487] device (tap405ec01b-70): carrier: link connected
Jan 21 18:27:30 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:27:30.453 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[c05cc987-04c1-40e0-8c67-994bfef511a4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:27:30 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:27:30.466 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[63bb595e-f8be-499c-b4e9-d20094c655f9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap405ec01b-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6c:95:02'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 38], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 461405, 'reachable_time': 16115, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 208666, 'error': None, 'target': 'ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:27:30 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:27:30.480 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[e8043ff7-34cb-4881-ae84-dfc1075862e1]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe6c:9502'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 461405, 'tstamp': 461405}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 208667, 'error': None, 'target': 'ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:27:30 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:27:30.495 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[41562443-5bda-4b49-a4ac-dd1e75bdd771]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap405ec01b-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6c:95:02'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 38], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 461405, 'reachable_time': 16115, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 208668, 'error': None, 'target': 'ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:27:30 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:27:30.522 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[454a6412-a4e3-464a-99b7-cb5953aefd1a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:27:30 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:27:30.576 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[5f38b656-bc3a-4445-a422-e9a86d97559e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:27:30 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:27:30.577 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap405ec01b-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:27:30 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:27:30.577 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 18:27:30 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:27:30.578 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap405ec01b-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:27:30 compute-0 nova_compute[183278]: 2026-01-21 18:27:30.579 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:27:30 compute-0 kernel: tap405ec01b-70: entered promiscuous mode
Jan 21 18:27:30 compute-0 NetworkManager[55506]: <info>  [1769020050.5801] manager: (tap405ec01b-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/57)
Jan 21 18:27:30 compute-0 nova_compute[183278]: 2026-01-21 18:27:30.581 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:27:30 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:27:30.583 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap405ec01b-70, col_values=(('external_ids', {'iface-id': '9c897ad2-8ce5-4903-8c83-1ed8f117dcdd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:27:30 compute-0 nova_compute[183278]: 2026-01-21 18:27:30.584 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:27:30 compute-0 ovn_controller[95419]: 2026-01-21T18:27:30Z|00120|binding|INFO|Releasing lport 9c897ad2-8ce5-4903-8c83-1ed8f117dcdd from this chassis (sb_readonly=0)
Jan 21 18:27:30 compute-0 nova_compute[183278]: 2026-01-21 18:27:30.595 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:27:30 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:27:30.596 104698 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/405ec01b-76d3-4c3c-a31b-5f16d9641fbf.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/405ec01b-76d3-4c3c-a31b-5f16d9641fbf.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 21 18:27:30 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:27:30.597 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[0b8c8b6d-480f-42d8-9099-b40dee7adb48]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:27:30 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:27:30.598 104698 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 18:27:30 compute-0 ovn_metadata_agent[104693]: global
Jan 21 18:27:30 compute-0 ovn_metadata_agent[104693]:     log         /dev/log local0 debug
Jan 21 18:27:30 compute-0 ovn_metadata_agent[104693]:     log-tag     haproxy-metadata-proxy-405ec01b-76d3-4c3c-a31b-5f16d9641fbf
Jan 21 18:27:30 compute-0 ovn_metadata_agent[104693]:     user        root
Jan 21 18:27:30 compute-0 ovn_metadata_agent[104693]:     group       root
Jan 21 18:27:30 compute-0 ovn_metadata_agent[104693]:     maxconn     1024
Jan 21 18:27:30 compute-0 ovn_metadata_agent[104693]:     pidfile     /var/lib/neutron/external/pids/405ec01b-76d3-4c3c-a31b-5f16d9641fbf.pid.haproxy
Jan 21 18:27:30 compute-0 ovn_metadata_agent[104693]:     daemon
Jan 21 18:27:30 compute-0 ovn_metadata_agent[104693]: 
Jan 21 18:27:30 compute-0 ovn_metadata_agent[104693]: defaults
Jan 21 18:27:30 compute-0 ovn_metadata_agent[104693]:     log global
Jan 21 18:27:30 compute-0 ovn_metadata_agent[104693]:     mode http
Jan 21 18:27:30 compute-0 ovn_metadata_agent[104693]:     option httplog
Jan 21 18:27:30 compute-0 ovn_metadata_agent[104693]:     option dontlognull
Jan 21 18:27:30 compute-0 ovn_metadata_agent[104693]:     option http-server-close
Jan 21 18:27:30 compute-0 ovn_metadata_agent[104693]:     option forwardfor
Jan 21 18:27:30 compute-0 ovn_metadata_agent[104693]:     retries                 3
Jan 21 18:27:30 compute-0 ovn_metadata_agent[104693]:     timeout http-request    30s
Jan 21 18:27:30 compute-0 ovn_metadata_agent[104693]:     timeout connect         30s
Jan 21 18:27:30 compute-0 ovn_metadata_agent[104693]:     timeout client          32s
Jan 21 18:27:30 compute-0 ovn_metadata_agent[104693]:     timeout server          32s
Jan 21 18:27:30 compute-0 ovn_metadata_agent[104693]:     timeout http-keep-alive 30s
Jan 21 18:27:30 compute-0 ovn_metadata_agent[104693]: 
Jan 21 18:27:30 compute-0 ovn_metadata_agent[104693]: 
Jan 21 18:27:30 compute-0 ovn_metadata_agent[104693]: listen listener
Jan 21 18:27:30 compute-0 ovn_metadata_agent[104693]:     bind 169.254.169.254:80
Jan 21 18:27:30 compute-0 ovn_metadata_agent[104693]:     server metadata /var/lib/neutron/metadata_proxy
Jan 21 18:27:30 compute-0 ovn_metadata_agent[104693]:     http-request add-header X-OVN-Network-ID 405ec01b-76d3-4c3c-a31b-5f16d9641fbf
Jan 21 18:27:30 compute-0 ovn_metadata_agent[104693]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 21 18:27:30 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:27:30.598 104698 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf', 'env', 'PROCESS_TAG=haproxy-405ec01b-76d3-4c3c-a31b-5f16d9641fbf', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/405ec01b-76d3-4c3c-a31b-5f16d9641fbf.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 21 18:27:30 compute-0 nova_compute[183278]: 2026-01-21 18:27:30.636 183284 DEBUG nova.virt.driver [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Emitting event <LifecycleEvent: 1769020050.636269, 4f3aff6b-5152-4498-a9fc-faba398385b8 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:27:30 compute-0 nova_compute[183278]: 2026-01-21 18:27:30.637 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] VM Started (Lifecycle Event)
Jan 21 18:27:30 compute-0 nova_compute[183278]: 2026-01-21 18:27:30.658 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:27:30 compute-0 nova_compute[183278]: 2026-01-21 18:27:30.662 183284 DEBUG nova.virt.driver [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Emitting event <LifecycleEvent: 1769020050.637009, 4f3aff6b-5152-4498-a9fc-faba398385b8 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:27:30 compute-0 nova_compute[183278]: 2026-01-21 18:27:30.662 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] VM Paused (Lifecycle Event)
Jan 21 18:27:30 compute-0 nova_compute[183278]: 2026-01-21 18:27:30.680 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:27:30 compute-0 nova_compute[183278]: 2026-01-21 18:27:30.682 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 18:27:30 compute-0 nova_compute[183278]: 2026-01-21 18:27:30.702 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 18:27:31 compute-0 podman[208705]: 2026-01-21 18:27:30.908364141 +0000 UTC m=+0.022904074 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 18:27:31 compute-0 nova_compute[183278]: 2026-01-21 18:27:31.020 183284 DEBUG nova.compute.manager [req-ae04eacd-a19d-494f-83d8-c9a0b61fa9e6 req-d69d17b9-63af-488a-a588-e15792fbb7ec 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Received event network-vif-plugged-79571f69-76ad-462f-8176-f34f6dab8ddb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:27:31 compute-0 nova_compute[183278]: 2026-01-21 18:27:31.021 183284 DEBUG oslo_concurrency.lockutils [req-ae04eacd-a19d-494f-83d8-c9a0b61fa9e6 req-d69d17b9-63af-488a-a588-e15792fbb7ec 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "4f3aff6b-5152-4498-a9fc-faba398385b8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:27:31 compute-0 nova_compute[183278]: 2026-01-21 18:27:31.021 183284 DEBUG oslo_concurrency.lockutils [req-ae04eacd-a19d-494f-83d8-c9a0b61fa9e6 req-d69d17b9-63af-488a-a588-e15792fbb7ec 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "4f3aff6b-5152-4498-a9fc-faba398385b8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:27:31 compute-0 nova_compute[183278]: 2026-01-21 18:27:31.021 183284 DEBUG oslo_concurrency.lockutils [req-ae04eacd-a19d-494f-83d8-c9a0b61fa9e6 req-d69d17b9-63af-488a-a588-e15792fbb7ec 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "4f3aff6b-5152-4498-a9fc-faba398385b8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:27:31 compute-0 nova_compute[183278]: 2026-01-21 18:27:31.022 183284 DEBUG nova.compute.manager [req-ae04eacd-a19d-494f-83d8-c9a0b61fa9e6 req-d69d17b9-63af-488a-a588-e15792fbb7ec 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Processing event network-vif-plugged-79571f69-76ad-462f-8176-f34f6dab8ddb _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 21 18:27:31 compute-0 nova_compute[183278]: 2026-01-21 18:27:31.023 183284 DEBUG nova.compute.manager [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 18:27:31 compute-0 nova_compute[183278]: 2026-01-21 18:27:31.026 183284 DEBUG nova.virt.driver [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Emitting event <LifecycleEvent: 1769020051.025861, 4f3aff6b-5152-4498-a9fc-faba398385b8 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:27:31 compute-0 nova_compute[183278]: 2026-01-21 18:27:31.026 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] VM Resumed (Lifecycle Event)
Jan 21 18:27:31 compute-0 nova_compute[183278]: 2026-01-21 18:27:31.028 183284 DEBUG nova.virt.libvirt.driver [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 18:27:31 compute-0 nova_compute[183278]: 2026-01-21 18:27:31.031 183284 INFO nova.virt.libvirt.driver [-] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Instance spawned successfully.
Jan 21 18:27:31 compute-0 nova_compute[183278]: 2026-01-21 18:27:31.031 183284 DEBUG nova.virt.libvirt.driver [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 21 18:27:31 compute-0 nova_compute[183278]: 2026-01-21 18:27:31.045 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:27:31 compute-0 nova_compute[183278]: 2026-01-21 18:27:31.050 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 18:27:31 compute-0 nova_compute[183278]: 2026-01-21 18:27:31.053 183284 DEBUG nova.virt.libvirt.driver [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:27:31 compute-0 nova_compute[183278]: 2026-01-21 18:27:31.053 183284 DEBUG nova.virt.libvirt.driver [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:27:31 compute-0 nova_compute[183278]: 2026-01-21 18:27:31.054 183284 DEBUG nova.virt.libvirt.driver [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:27:31 compute-0 nova_compute[183278]: 2026-01-21 18:27:31.054 183284 DEBUG nova.virt.libvirt.driver [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:27:31 compute-0 nova_compute[183278]: 2026-01-21 18:27:31.054 183284 DEBUG nova.virt.libvirt.driver [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:27:31 compute-0 nova_compute[183278]: 2026-01-21 18:27:31.055 183284 DEBUG nova.virt.libvirt.driver [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:27:31 compute-0 nova_compute[183278]: 2026-01-21 18:27:31.078 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 18:27:31 compute-0 nova_compute[183278]: 2026-01-21 18:27:31.114 183284 INFO nova.compute.manager [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Took 6.84 seconds to spawn the instance on the hypervisor.
Jan 21 18:27:31 compute-0 nova_compute[183278]: 2026-01-21 18:27:31.115 183284 DEBUG nova.compute.manager [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:27:31 compute-0 podman[208705]: 2026-01-21 18:27:31.152189539 +0000 UTC m=+0.266729452 container create 66acd98b066ec74a4f024834b47a48ce7c3883a5f07291b01036be9258827276 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 21 18:27:31 compute-0 nova_compute[183278]: 2026-01-21 18:27:31.161 183284 DEBUG nova.network.neutron [req-ab9cae96-6bf1-4eed-aee6-67dce1cce6f4 req-1d1bc29e-8f15-4116-9d6d-e8325db949ac 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Updated VIF entry in instance network info cache for port 79571f69-76ad-462f-8176-f34f6dab8ddb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 18:27:31 compute-0 nova_compute[183278]: 2026-01-21 18:27:31.161 183284 DEBUG nova.network.neutron [req-ab9cae96-6bf1-4eed-aee6-67dce1cce6f4 req-1d1bc29e-8f15-4116-9d6d-e8325db949ac 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Updating instance_info_cache with network_info: [{"id": "79571f69-76ad-462f-8176-f34f6dab8ddb", "address": "fa:16:3e:5d:b5:96", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap79571f69-76", "ovs_interfaceid": "79571f69-76ad-462f-8176-f34f6dab8ddb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:27:31 compute-0 nova_compute[183278]: 2026-01-21 18:27:31.178 183284 DEBUG oslo_concurrency.lockutils [req-ab9cae96-6bf1-4eed-aee6-67dce1cce6f4 req-1d1bc29e-8f15-4116-9d6d-e8325db949ac 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Releasing lock "refresh_cache-4f3aff6b-5152-4498-a9fc-faba398385b8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 18:27:31 compute-0 nova_compute[183278]: 2026-01-21 18:27:31.181 183284 INFO nova.compute.manager [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Took 7.92 seconds to build instance.
Jan 21 18:27:31 compute-0 systemd[1]: Started libpod-conmon-66acd98b066ec74a4f024834b47a48ce7c3883a5f07291b01036be9258827276.scope.
Jan 21 18:27:31 compute-0 nova_compute[183278]: 2026-01-21 18:27:31.198 183284 DEBUG oslo_concurrency.lockutils [None req-8d33ecad-bafe-4138-832a-6eea42943f7b 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "4f3aff6b-5152-4498-a9fc-faba398385b8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.005s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:27:31 compute-0 systemd[1]: Started libcrun container.
Jan 21 18:27:31 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a5fb811a3a232fd61c168e00475670505784f906e548f01048f7dad324faa3d9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 18:27:31 compute-0 podman[208705]: 2026-01-21 18:27:31.235926236 +0000 UTC m=+0.350466169 container init 66acd98b066ec74a4f024834b47a48ce7c3883a5f07291b01036be9258827276 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 21 18:27:31 compute-0 podman[208705]: 2026-01-21 18:27:31.24188573 +0000 UTC m=+0.356425633 container start 66acd98b066ec74a4f024834b47a48ce7c3883a5f07291b01036be9258827276 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 21 18:27:31 compute-0 neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf[208720]: [NOTICE]   (208724) : New worker (208726) forked
Jan 21 18:27:31 compute-0 neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf[208720]: [NOTICE]   (208724) : Loading success.
Jan 21 18:27:31 compute-0 openstack_network_exporter[195402]: ERROR   18:27:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 21 18:27:31 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:27:31 compute-0 openstack_network_exporter[195402]: ERROR   18:27:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 21 18:27:31 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:27:33 compute-0 nova_compute[183278]: 2026-01-21 18:27:33.548 183284 DEBUG nova.compute.manager [req-f44a71b5-b17c-4735-8b7c-b7d43ef03258 req-0548539a-eae4-44b1-a135-1176098266d1 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Received event network-vif-plugged-79571f69-76ad-462f-8176-f34f6dab8ddb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:27:33 compute-0 nova_compute[183278]: 2026-01-21 18:27:33.549 183284 DEBUG oslo_concurrency.lockutils [req-f44a71b5-b17c-4735-8b7c-b7d43ef03258 req-0548539a-eae4-44b1-a135-1176098266d1 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "4f3aff6b-5152-4498-a9fc-faba398385b8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:27:33 compute-0 nova_compute[183278]: 2026-01-21 18:27:33.549 183284 DEBUG oslo_concurrency.lockutils [req-f44a71b5-b17c-4735-8b7c-b7d43ef03258 req-0548539a-eae4-44b1-a135-1176098266d1 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "4f3aff6b-5152-4498-a9fc-faba398385b8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:27:33 compute-0 nova_compute[183278]: 2026-01-21 18:27:33.549 183284 DEBUG oslo_concurrency.lockutils [req-f44a71b5-b17c-4735-8b7c-b7d43ef03258 req-0548539a-eae4-44b1-a135-1176098266d1 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "4f3aff6b-5152-4498-a9fc-faba398385b8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:27:33 compute-0 nova_compute[183278]: 2026-01-21 18:27:33.550 183284 DEBUG nova.compute.manager [req-f44a71b5-b17c-4735-8b7c-b7d43ef03258 req-0548539a-eae4-44b1-a135-1176098266d1 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] No waiting events found dispatching network-vif-plugged-79571f69-76ad-462f-8176-f34f6dab8ddb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:27:33 compute-0 nova_compute[183278]: 2026-01-21 18:27:33.550 183284 WARNING nova.compute.manager [req-f44a71b5-b17c-4735-8b7c-b7d43ef03258 req-0548539a-eae4-44b1-a135-1176098266d1 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Received unexpected event network-vif-plugged-79571f69-76ad-462f-8176-f34f6dab8ddb for instance with vm_state active and task_state None.
Jan 21 18:27:34 compute-0 nova_compute[183278]: 2026-01-21 18:27:34.174 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:27:35 compute-0 podman[208737]: 2026-01-21 18:27:35.029384017 +0000 UTC m=+0.085487150 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_metadata_agent)
Jan 21 18:27:35 compute-0 podman[208736]: 2026-01-21 18:27:35.039904501 +0000 UTC m=+0.095468371 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Jan 21 18:27:35 compute-0 nova_compute[183278]: 2026-01-21 18:27:35.414 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:27:39 compute-0 podman[208780]: 2026-01-21 18:27:39.000719759 +0000 UTC m=+0.050048072 container health_status 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 21 18:27:39 compute-0 nova_compute[183278]: 2026-01-21 18:27:39.177 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:27:40 compute-0 nova_compute[183278]: 2026-01-21 18:27:40.416 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:27:43 compute-0 ovn_controller[95419]: 2026-01-21T18:27:43Z|00020|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:5d:b5:96 10.100.0.13
Jan 21 18:27:43 compute-0 ovn_controller[95419]: 2026-01-21T18:27:43Z|00021|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:5d:b5:96 10.100.0.13
Jan 21 18:27:44 compute-0 nova_compute[183278]: 2026-01-21 18:27:44.180 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:27:45 compute-0 nova_compute[183278]: 2026-01-21 18:27:45.417 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:27:49 compute-0 nova_compute[183278]: 2026-01-21 18:27:49.183 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:27:50 compute-0 nova_compute[183278]: 2026-01-21 18:27:50.419 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:27:54 compute-0 nova_compute[183278]: 2026-01-21 18:27:54.185 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:27:55 compute-0 nova_compute[183278]: 2026-01-21 18:27:55.421 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:27:59 compute-0 podman[208825]: 2026-01-21 18:27:59.009998576 +0000 UTC m=+0.061100219 container health_status 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vcs-type=git, io.openshift.expose-services=, name=ubi9-minimal, release=1755695350, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, distribution-scope=public, managed_by=edpm_ansible, architecture=x86_64, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., container_name=openstack_network_exporter)
Jan 21 18:27:59 compute-0 nova_compute[183278]: 2026-01-21 18:27:59.187 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:27:59 compute-0 podman[192560]: time="2026-01-21T18:27:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 18:27:59 compute-0 podman[192560]: @ - - [21/Jan/2026:18:27:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16587 "" "Go-http-client/1.1"
Jan 21 18:27:59 compute-0 podman[192560]: @ - - [21/Jan/2026:18:27:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2635 "" "Go-http-client/1.1"
Jan 21 18:28:00 compute-0 nova_compute[183278]: 2026-01-21 18:28:00.423 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:28:01 compute-0 openstack_network_exporter[195402]: ERROR   18:28:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 21 18:28:01 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:28:01 compute-0 openstack_network_exporter[195402]: ERROR   18:28:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 21 18:28:01 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:28:04 compute-0 nova_compute[183278]: 2026-01-21 18:28:04.189 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:28:05 compute-0 nova_compute[183278]: 2026-01-21 18:28:05.426 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:28:05 compute-0 podman[208848]: 2026-01-21 18:28:05.999115022 +0000 UTC m=+0.051001846 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 21 18:28:06 compute-0 podman[208847]: 2026-01-21 18:28:06.020873408 +0000 UTC m=+0.075822106 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, 
org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 21 18:28:09 compute-0 nova_compute[183278]: 2026-01-21 18:28:09.192 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:28:09 compute-0 ovn_controller[95419]: 2026-01-21T18:28:09Z|00121|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 21 18:28:09 compute-0 podman[208891]: 2026-01-21 18:28:09.99137181 +0000 UTC m=+0.048115804 container health_status 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 21 18:28:10 compute-0 nova_compute[183278]: 2026-01-21 18:28:10.427 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:28:14 compute-0 nova_compute[183278]: 2026-01-21 18:28:14.195 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:28:15 compute-0 nova_compute[183278]: 2026-01-21 18:28:15.430 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:28:16 compute-0 nova_compute[183278]: 2026-01-21 18:28:16.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:28:16 compute-0 nova_compute[183278]: 2026-01-21 18:28:16.818 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 18:28:16 compute-0 nova_compute[183278]: 2026-01-21 18:28:16.818 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 18:28:17 compute-0 nova_compute[183278]: 2026-01-21 18:28:17.427 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "refresh_cache-4f3aff6b-5152-4498-a9fc-faba398385b8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:28:17 compute-0 nova_compute[183278]: 2026-01-21 18:28:17.427 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquired lock "refresh_cache-4f3aff6b-5152-4498-a9fc-faba398385b8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 18:28:17 compute-0 nova_compute[183278]: 2026-01-21 18:28:17.428 183284 DEBUG nova.network.neutron [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 21 18:28:17 compute-0 nova_compute[183278]: 2026-01-21 18:28:17.428 183284 DEBUG nova.objects.instance [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lazy-loading 'info_cache' on Instance uuid 4f3aff6b-5152-4498-a9fc-faba398385b8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:28:19 compute-0 nova_compute[183278]: 2026-01-21 18:28:19.197 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:28:19 compute-0 nova_compute[183278]: 2026-01-21 18:28:19.934 183284 DEBUG nova.network.neutron [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Updating instance_info_cache with network_info: [{"id": "79571f69-76ad-462f-8176-f34f6dab8ddb", "address": "fa:16:3e:5d:b5:96", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap79571f69-76", "ovs_interfaceid": "79571f69-76ad-462f-8176-f34f6dab8ddb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:28:19 compute-0 nova_compute[183278]: 2026-01-21 18:28:19.953 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Releasing lock "refresh_cache-4f3aff6b-5152-4498-a9fc-faba398385b8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 18:28:19 compute-0 nova_compute[183278]: 2026-01-21 18:28:19.953 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 21 18:28:19 compute-0 nova_compute[183278]: 2026-01-21 18:28:19.953 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:28:19 compute-0 nova_compute[183278]: 2026-01-21 18:28:19.953 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:28:19 compute-0 nova_compute[183278]: 2026-01-21 18:28:19.976 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:28:19 compute-0 nova_compute[183278]: 2026-01-21 18:28:19.977 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:28:19 compute-0 nova_compute[183278]: 2026-01-21 18:28:19.977 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:28:19 compute-0 nova_compute[183278]: 2026-01-21 18:28:19.977 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 18:28:20 compute-0 nova_compute[183278]: 2026-01-21 18:28:20.038 183284 DEBUG oslo_concurrency.processutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4f3aff6b-5152-4498-a9fc-faba398385b8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:28:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:28:20.085 104698 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:28:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:28:20.086 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:28:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:28:20.087 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:28:20 compute-0 nova_compute[183278]: 2026-01-21 18:28:20.095 183284 DEBUG oslo_concurrency.processutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4f3aff6b-5152-4498-a9fc-faba398385b8/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:28:20 compute-0 nova_compute[183278]: 2026-01-21 18:28:20.096 183284 DEBUG oslo_concurrency.processutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4f3aff6b-5152-4498-a9fc-faba398385b8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:28:20 compute-0 nova_compute[183278]: 2026-01-21 18:28:20.152 183284 DEBUG oslo_concurrency.processutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4f3aff6b-5152-4498-a9fc-faba398385b8/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:28:20 compute-0 nova_compute[183278]: 2026-01-21 18:28:20.292 183284 WARNING nova.virt.libvirt.driver [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 18:28:20 compute-0 nova_compute[183278]: 2026-01-21 18:28:20.293 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5707MB free_disk=73.35280990600586GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 18:28:20 compute-0 nova_compute[183278]: 2026-01-21 18:28:20.293 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:28:20 compute-0 nova_compute[183278]: 2026-01-21 18:28:20.293 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:28:20 compute-0 nova_compute[183278]: 2026-01-21 18:28:20.371 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Instance 4f3aff6b-5152-4498-a9fc-faba398385b8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 21 18:28:20 compute-0 nova_compute[183278]: 2026-01-21 18:28:20.371 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 18:28:20 compute-0 nova_compute[183278]: 2026-01-21 18:28:20.371 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 18:28:20 compute-0 nova_compute[183278]: 2026-01-21 18:28:20.385 183284 DEBUG nova.scheduler.client.report [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Refreshing inventories for resource provider 502e4243-611b-433d-a766-9b485d51652d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 21 18:28:20 compute-0 nova_compute[183278]: 2026-01-21 18:28:20.406 183284 DEBUG nova.scheduler.client.report [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Updating ProviderTree inventory for provider 502e4243-611b-433d-a766-9b485d51652d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 21 18:28:20 compute-0 nova_compute[183278]: 2026-01-21 18:28:20.407 183284 DEBUG nova.compute.provider_tree [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Updating inventory in ProviderTree for provider 502e4243-611b-433d-a766-9b485d51652d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 21 18:28:20 compute-0 nova_compute[183278]: 2026-01-21 18:28:20.422 183284 DEBUG nova.scheduler.client.report [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Refreshing aggregate associations for resource provider 502e4243-611b-433d-a766-9b485d51652d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 21 18:28:20 compute-0 nova_compute[183278]: 2026-01-21 18:28:20.431 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:28:20 compute-0 nova_compute[183278]: 2026-01-21 18:28:20.452 183284 DEBUG nova.scheduler.client.report [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Refreshing trait associations for resource provider 502e4243-611b-433d-a766-9b485d51652d, traits: COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_RESCUE_BFV,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_MMX,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NODE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_IDE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 21 18:28:20 compute-0 nova_compute[183278]: 2026-01-21 18:28:20.488 183284 DEBUG nova.compute.provider_tree [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Inventory has not changed in ProviderTree for provider: 502e4243-611b-433d-a766-9b485d51652d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 18:28:20 compute-0 nova_compute[183278]: 2026-01-21 18:28:20.502 183284 DEBUG nova.scheduler.client.report [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Inventory has not changed for provider 502e4243-611b-433d-a766-9b485d51652d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 18:28:20 compute-0 nova_compute[183278]: 2026-01-21 18:28:20.522 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 18:28:20 compute-0 nova_compute[183278]: 2026-01-21 18:28:20.522 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.229s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:28:22 compute-0 nova_compute[183278]: 2026-01-21 18:28:22.385 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:28:22 compute-0 nova_compute[183278]: 2026-01-21 18:28:22.812 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:28:22 compute-0 nova_compute[183278]: 2026-01-21 18:28:22.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:28:23 compute-0 nova_compute[183278]: 2026-01-21 18:28:23.189 183284 DEBUG nova.compute.manager [None req-1e2b8bc1-e9e3-4456-b041-d724c7c33917 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Adding trait COMPUTE_STATUS_DISABLED to compute node resource provider 502e4243-611b-433d-a766-9b485d51652d in placement. update_compute_provider_status /usr/lib/python3.9/site-packages/nova/compute/manager.py:610
Jan 21 18:28:23 compute-0 nova_compute[183278]: 2026-01-21 18:28:23.233 183284 DEBUG nova.compute.provider_tree [None req-1e2b8bc1-e9e3-4456-b041-d724c7c33917 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Updating resource provider 502e4243-611b-433d-a766-9b485d51652d generation from 26 to 27 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Jan 21 18:28:24 compute-0 nova_compute[183278]: 2026-01-21 18:28:24.200 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:28:25 compute-0 nova_compute[183278]: 2026-01-21 18:28:25.432 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:28:25 compute-0 nova_compute[183278]: 2026-01-21 18:28:25.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:28:26 compute-0 nova_compute[183278]: 2026-01-21 18:28:26.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:28:26 compute-0 nova_compute[183278]: 2026-01-21 18:28:26.816 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 18:28:27 compute-0 nova_compute[183278]: 2026-01-21 18:28:27.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:28:29 compute-0 nova_compute[183278]: 2026-01-21 18:28:29.202 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:28:29 compute-0 nova_compute[183278]: 2026-01-21 18:28:29.521 183284 DEBUG nova.virt.libvirt.driver [None req-e898b704-760b-4775-a596-907a3cb3c036 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Check if temp file /var/lib/nova/instances/tmpcc494bfv exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Jan 21 18:28:29 compute-0 nova_compute[183278]: 2026-01-21 18:28:29.522 183284 DEBUG nova.compute.manager [None req-e898b704-760b-4775-a596-907a3cb3c036 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpcc494bfv',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='4f3aff6b-5152-4498-a9fc-faba398385b8',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Jan 21 18:28:29 compute-0 podman[192560]: time="2026-01-21T18:28:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 18:28:29 compute-0 podman[192560]: @ - - [21/Jan/2026:18:28:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16587 "" "Go-http-client/1.1"
Jan 21 18:28:29 compute-0 podman[192560]: @ - - [21/Jan/2026:18:28:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2636 "" "Go-http-client/1.1"
Jan 21 18:28:30 compute-0 podman[208931]: 2026-01-21 18:28:30.001381585 +0000 UTC m=+0.059395147 container health_status 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, distribution-scope=public, io.buildah.version=1.33.7, vendor=Red Hat, Inc., config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, managed_by=edpm_ansible, name=ubi9-minimal, version=9.6, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, architecture=x86_64, maintainer=Red Hat, Inc.)
Jan 21 18:28:30 compute-0 nova_compute[183278]: 2026-01-21 18:28:30.433 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:28:30 compute-0 nova_compute[183278]: 2026-01-21 18:28:30.483 183284 DEBUG oslo_concurrency.processutils [None req-e898b704-760b-4775-a596-907a3cb3c036 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4f3aff6b-5152-4498-a9fc-faba398385b8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:28:30 compute-0 nova_compute[183278]: 2026-01-21 18:28:30.543 183284 DEBUG oslo_concurrency.processutils [None req-e898b704-760b-4775-a596-907a3cb3c036 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4f3aff6b-5152-4498-a9fc-faba398385b8/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:28:30 compute-0 nova_compute[183278]: 2026-01-21 18:28:30.545 183284 DEBUG oslo_concurrency.processutils [None req-e898b704-760b-4775-a596-907a3cb3c036 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4f3aff6b-5152-4498-a9fc-faba398385b8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:28:30 compute-0 nova_compute[183278]: 2026-01-21 18:28:30.606 183284 DEBUG oslo_concurrency.processutils [None req-e898b704-760b-4775-a596-907a3cb3c036 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4f3aff6b-5152-4498-a9fc-faba398385b8/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:28:31 compute-0 openstack_network_exporter[195402]: ERROR   18:28:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 21 18:28:31 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:28:31 compute-0 openstack_network_exporter[195402]: ERROR   18:28:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 21 18:28:31 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:28:32 compute-0 sshd-session[208959]: Accepted publickey for nova from 192.168.122.101 port 42726 ssh2: ECDSA SHA256:29a5JNhHHz2bb0ACqZTr6qOKeSRnhiTRA8SK+rzn9gs
Jan 21 18:28:32 compute-0 systemd-logind[782]: New session 34 of user nova.
Jan 21 18:28:32 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Jan 21 18:28:32 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 21 18:28:32 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 21 18:28:32 compute-0 systemd[1]: Starting User Manager for UID 42436...
Jan 21 18:28:32 compute-0 systemd[208963]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 21 18:28:32 compute-0 systemd[208963]: Queued start job for default target Main User Target.
Jan 21 18:28:32 compute-0 systemd[208963]: Created slice User Application Slice.
Jan 21 18:28:32 compute-0 systemd[208963]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 21 18:28:32 compute-0 systemd[208963]: Started Daily Cleanup of User's Temporary Directories.
Jan 21 18:28:32 compute-0 systemd[208963]: Reached target Paths.
Jan 21 18:28:32 compute-0 systemd[208963]: Reached target Timers.
Jan 21 18:28:32 compute-0 systemd[208963]: Starting D-Bus User Message Bus Socket...
Jan 21 18:28:32 compute-0 systemd[208963]: Starting Create User's Volatile Files and Directories...
Jan 21 18:28:32 compute-0 systemd[208963]: Listening on D-Bus User Message Bus Socket.
Jan 21 18:28:32 compute-0 systemd[208963]: Reached target Sockets.
Jan 21 18:28:32 compute-0 systemd[208963]: Finished Create User's Volatile Files and Directories.
Jan 21 18:28:32 compute-0 systemd[208963]: Reached target Basic System.
Jan 21 18:28:32 compute-0 systemd[208963]: Reached target Main User Target.
Jan 21 18:28:32 compute-0 systemd[208963]: Startup finished in 121ms.
Jan 21 18:28:32 compute-0 systemd[1]: Started User Manager for UID 42436.
Jan 21 18:28:32 compute-0 systemd[1]: Started Session 34 of User nova.
Jan 21 18:28:32 compute-0 sshd-session[208959]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 21 18:28:32 compute-0 sshd-session[208978]: Received disconnect from 192.168.122.101 port 42726:11: disconnected by user
Jan 21 18:28:32 compute-0 sshd-session[208978]: Disconnected from user nova 192.168.122.101 port 42726
Jan 21 18:28:32 compute-0 sshd-session[208959]: pam_unix(sshd:session): session closed for user nova
Jan 21 18:28:32 compute-0 systemd-logind[782]: Session 34 logged out. Waiting for processes to exit.
Jan 21 18:28:32 compute-0 systemd[1]: session-34.scope: Deactivated successfully.
Jan 21 18:28:32 compute-0 systemd-logind[782]: Removed session 34.
Jan 21 18:28:33 compute-0 nova_compute[183278]: 2026-01-21 18:28:33.188 183284 DEBUG nova.compute.manager [req-ca9f2ea8-1862-4e59-a9a1-abd418f3a7e6 req-d653982e-2f72-4362-b634-32a48042fe64 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Received event network-vif-unplugged-79571f69-76ad-462f-8176-f34f6dab8ddb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:28:33 compute-0 nova_compute[183278]: 2026-01-21 18:28:33.190 183284 DEBUG oslo_concurrency.lockutils [req-ca9f2ea8-1862-4e59-a9a1-abd418f3a7e6 req-d653982e-2f72-4362-b634-32a48042fe64 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "4f3aff6b-5152-4498-a9fc-faba398385b8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:28:33 compute-0 nova_compute[183278]: 2026-01-21 18:28:33.190 183284 DEBUG oslo_concurrency.lockutils [req-ca9f2ea8-1862-4e59-a9a1-abd418f3a7e6 req-d653982e-2f72-4362-b634-32a48042fe64 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "4f3aff6b-5152-4498-a9fc-faba398385b8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:28:33 compute-0 nova_compute[183278]: 2026-01-21 18:28:33.190 183284 DEBUG oslo_concurrency.lockutils [req-ca9f2ea8-1862-4e59-a9a1-abd418f3a7e6 req-d653982e-2f72-4362-b634-32a48042fe64 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "4f3aff6b-5152-4498-a9fc-faba398385b8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:28:33 compute-0 nova_compute[183278]: 2026-01-21 18:28:33.191 183284 DEBUG nova.compute.manager [req-ca9f2ea8-1862-4e59-a9a1-abd418f3a7e6 req-d653982e-2f72-4362-b634-32a48042fe64 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] No waiting events found dispatching network-vif-unplugged-79571f69-76ad-462f-8176-f34f6dab8ddb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:28:33 compute-0 nova_compute[183278]: 2026-01-21 18:28:33.191 183284 DEBUG nova.compute.manager [req-ca9f2ea8-1862-4e59-a9a1-abd418f3a7e6 req-d653982e-2f72-4362-b634-32a48042fe64 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Received event network-vif-unplugged-79571f69-76ad-462f-8176-f34f6dab8ddb for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 21 18:28:33 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:28:33.218 104698 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e2:aa:66', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f6:39:87:73:c8'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 18:28:33 compute-0 nova_compute[183278]: 2026-01-21 18:28:33.218 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:28:33 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:28:33.219 104698 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 21 18:28:33 compute-0 nova_compute[183278]: 2026-01-21 18:28:33.913 183284 INFO nova.compute.manager [None req-e898b704-760b-4775-a596-907a3cb3c036 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Took 3.30 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Jan 21 18:28:33 compute-0 nova_compute[183278]: 2026-01-21 18:28:33.913 183284 DEBUG nova.compute.manager [None req-e898b704-760b-4775-a596-907a3cb3c036 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 18:28:33 compute-0 nova_compute[183278]: 2026-01-21 18:28:33.929 183284 DEBUG nova.compute.manager [None req-e898b704-760b-4775-a596-907a3cb3c036 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpcc494bfv',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='4f3aff6b-5152-4498-a9fc-faba398385b8',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(52a63dd4-955f-4822-86b3-7a70b1a105d5),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Jan 21 18:28:33 compute-0 nova_compute[183278]: 2026-01-21 18:28:33.949 183284 DEBUG nova.objects.instance [None req-e898b704-760b-4775-a596-907a3cb3c036 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lazy-loading 'migration_context' on Instance uuid 4f3aff6b-5152-4498-a9fc-faba398385b8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:28:33 compute-0 nova_compute[183278]: 2026-01-21 18:28:33.951 183284 DEBUG nova.virt.libvirt.driver [None req-e898b704-760b-4775-a596-907a3cb3c036 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Jan 21 18:28:33 compute-0 nova_compute[183278]: 2026-01-21 18:28:33.952 183284 DEBUG nova.virt.libvirt.driver [None req-e898b704-760b-4775-a596-907a3cb3c036 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Jan 21 18:28:33 compute-0 nova_compute[183278]: 2026-01-21 18:28:33.952 183284 DEBUG nova.virt.libvirt.driver [None req-e898b704-760b-4775-a596-907a3cb3c036 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Jan 21 18:28:33 compute-0 nova_compute[183278]: 2026-01-21 18:28:33.968 183284 DEBUG nova.virt.libvirt.vif [None req-e898b704-760b-4775-a596-907a3cb3c036 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T18:27:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1799071958',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1799071958',id=15,image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T18:27:31Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='fe688847145f4dee992c72dd40bbc1ac',ramdisk_id='',reservation_id='r-2pe732ao',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',
owner_project_name='tempest-TestExecuteStrategies-1753607426',owner_user_name='tempest-TestExecuteStrategies-1753607426-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T18:27:31Z,user_data=None,user_id='41dc6e790bc54fbfaf5c6007d3fa5f63',uuid=4f3aff6b-5152-4498-a9fc-faba398385b8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "79571f69-76ad-462f-8176-f34f6dab8ddb", "address": "fa:16:3e:5d:b5:96", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap79571f69-76", "ovs_interfaceid": "79571f69-76ad-462f-8176-f34f6dab8ddb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 18:28:33 compute-0 nova_compute[183278]: 2026-01-21 18:28:33.969 183284 DEBUG nova.network.os_vif_util [None req-e898b704-760b-4775-a596-907a3cb3c036 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Converting VIF {"id": "79571f69-76ad-462f-8176-f34f6dab8ddb", "address": "fa:16:3e:5d:b5:96", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap79571f69-76", "ovs_interfaceid": "79571f69-76ad-462f-8176-f34f6dab8ddb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 18:28:33 compute-0 nova_compute[183278]: 2026-01-21 18:28:33.969 183284 DEBUG nova.network.os_vif_util [None req-e898b704-760b-4775-a596-907a3cb3c036 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:5d:b5:96,bridge_name='br-int',has_traffic_filtering=True,id=79571f69-76ad-462f-8176-f34f6dab8ddb,network=Network(405ec01b-76d3-4c3c-a31b-5f16d9641fbf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap79571f69-76') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 18:28:33 compute-0 nova_compute[183278]: 2026-01-21 18:28:33.970 183284 DEBUG nova.virt.libvirt.migration [None req-e898b704-760b-4775-a596-907a3cb3c036 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Updating guest XML with vif config: <interface type="ethernet">
Jan 21 18:28:33 compute-0 nova_compute[183278]:   <mac address="fa:16:3e:5d:b5:96"/>
Jan 21 18:28:33 compute-0 nova_compute[183278]:   <model type="virtio"/>
Jan 21 18:28:33 compute-0 nova_compute[183278]:   <driver name="vhost" rx_queue_size="512"/>
Jan 21 18:28:33 compute-0 nova_compute[183278]:   <mtu size="1442"/>
Jan 21 18:28:33 compute-0 nova_compute[183278]:   <target dev="tap79571f69-76"/>
Jan 21 18:28:33 compute-0 nova_compute[183278]: </interface>
Jan 21 18:28:33 compute-0 nova_compute[183278]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Jan 21 18:28:33 compute-0 nova_compute[183278]: 2026-01-21 18:28:33.970 183284 DEBUG nova.virt.libvirt.driver [None req-e898b704-760b-4775-a596-907a3cb3c036 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Jan 21 18:28:34 compute-0 nova_compute[183278]: 2026-01-21 18:28:34.204 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:28:34 compute-0 nova_compute[183278]: 2026-01-21 18:28:34.455 183284 DEBUG nova.virt.libvirt.migration [None req-e898b704-760b-4775-a596-907a3cb3c036 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 21 18:28:34 compute-0 nova_compute[183278]: 2026-01-21 18:28:34.455 183284 INFO nova.virt.libvirt.migration [None req-e898b704-760b-4775-a596-907a3cb3c036 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Increasing downtime to 50 ms after 0 sec elapsed time
Jan 21 18:28:34 compute-0 nova_compute[183278]: 2026-01-21 18:28:34.561 183284 INFO nova.virt.libvirt.driver [None req-e898b704-760b-4775-a596-907a3cb3c036 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Jan 21 18:28:35 compute-0 nova_compute[183278]: 2026-01-21 18:28:35.063 183284 DEBUG nova.virt.libvirt.migration [None req-e898b704-760b-4775-a596-907a3cb3c036 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 21 18:28:35 compute-0 nova_compute[183278]: 2026-01-21 18:28:35.064 183284 DEBUG nova.virt.libvirt.migration [None req-e898b704-760b-4775-a596-907a3cb3c036 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Jan 21 18:28:35 compute-0 nova_compute[183278]: 2026-01-21 18:28:35.283 183284 DEBUG nova.compute.manager [req-19507d78-75f4-4f63-b5c1-d2cc8ede9c4f req-2825b028-52af-4d6c-934d-3bcad0627b5a 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Received event network-vif-plugged-79571f69-76ad-462f-8176-f34f6dab8ddb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:28:35 compute-0 nova_compute[183278]: 2026-01-21 18:28:35.284 183284 DEBUG oslo_concurrency.lockutils [req-19507d78-75f4-4f63-b5c1-d2cc8ede9c4f req-2825b028-52af-4d6c-934d-3bcad0627b5a 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "4f3aff6b-5152-4498-a9fc-faba398385b8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:28:35 compute-0 nova_compute[183278]: 2026-01-21 18:28:35.285 183284 DEBUG oslo_concurrency.lockutils [req-19507d78-75f4-4f63-b5c1-d2cc8ede9c4f req-2825b028-52af-4d6c-934d-3bcad0627b5a 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "4f3aff6b-5152-4498-a9fc-faba398385b8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:28:35 compute-0 nova_compute[183278]: 2026-01-21 18:28:35.285 183284 DEBUG oslo_concurrency.lockutils [req-19507d78-75f4-4f63-b5c1-d2cc8ede9c4f req-2825b028-52af-4d6c-934d-3bcad0627b5a 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "4f3aff6b-5152-4498-a9fc-faba398385b8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:28:35 compute-0 nova_compute[183278]: 2026-01-21 18:28:35.286 183284 DEBUG nova.compute.manager [req-19507d78-75f4-4f63-b5c1-d2cc8ede9c4f req-2825b028-52af-4d6c-934d-3bcad0627b5a 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] No waiting events found dispatching network-vif-plugged-79571f69-76ad-462f-8176-f34f6dab8ddb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:28:35 compute-0 nova_compute[183278]: 2026-01-21 18:28:35.286 183284 WARNING nova.compute.manager [req-19507d78-75f4-4f63-b5c1-d2cc8ede9c4f req-2825b028-52af-4d6c-934d-3bcad0627b5a 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Received unexpected event network-vif-plugged-79571f69-76ad-462f-8176-f34f6dab8ddb for instance with vm_state active and task_state migrating.
Jan 21 18:28:35 compute-0 nova_compute[183278]: 2026-01-21 18:28:35.287 183284 DEBUG nova.compute.manager [req-19507d78-75f4-4f63-b5c1-d2cc8ede9c4f req-2825b028-52af-4d6c-934d-3bcad0627b5a 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Received event network-changed-79571f69-76ad-462f-8176-f34f6dab8ddb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:28:35 compute-0 nova_compute[183278]: 2026-01-21 18:28:35.287 183284 DEBUG nova.compute.manager [req-19507d78-75f4-4f63-b5c1-d2cc8ede9c4f req-2825b028-52af-4d6c-934d-3bcad0627b5a 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Refreshing instance network info cache due to event network-changed-79571f69-76ad-462f-8176-f34f6dab8ddb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 18:28:35 compute-0 nova_compute[183278]: 2026-01-21 18:28:35.288 183284 DEBUG oslo_concurrency.lockutils [req-19507d78-75f4-4f63-b5c1-d2cc8ede9c4f req-2825b028-52af-4d6c-934d-3bcad0627b5a 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "refresh_cache-4f3aff6b-5152-4498-a9fc-faba398385b8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:28:35 compute-0 nova_compute[183278]: 2026-01-21 18:28:35.288 183284 DEBUG oslo_concurrency.lockutils [req-19507d78-75f4-4f63-b5c1-d2cc8ede9c4f req-2825b028-52af-4d6c-934d-3bcad0627b5a 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquired lock "refresh_cache-4f3aff6b-5152-4498-a9fc-faba398385b8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 18:28:35 compute-0 nova_compute[183278]: 2026-01-21 18:28:35.289 183284 DEBUG nova.network.neutron [req-19507d78-75f4-4f63-b5c1-d2cc8ede9c4f req-2825b028-52af-4d6c-934d-3bcad0627b5a 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Refreshing network info cache for port 79571f69-76ad-462f-8176-f34f6dab8ddb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 18:28:35 compute-0 nova_compute[183278]: 2026-01-21 18:28:35.434 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:28:35 compute-0 nova_compute[183278]: 2026-01-21 18:28:35.567 183284 DEBUG nova.virt.libvirt.migration [None req-e898b704-760b-4775-a596-907a3cb3c036 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 21 18:28:35 compute-0 nova_compute[183278]: 2026-01-21 18:28:35.567 183284 DEBUG nova.virt.libvirt.migration [None req-e898b704-760b-4775-a596-907a3cb3c036 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Jan 21 18:28:36 compute-0 nova_compute[183278]: 2026-01-21 18:28:36.074 183284 DEBUG nova.virt.libvirt.migration [None req-e898b704-760b-4775-a596-907a3cb3c036 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 21 18:28:36 compute-0 nova_compute[183278]: 2026-01-21 18:28:36.074 183284 DEBUG nova.virt.libvirt.migration [None req-e898b704-760b-4775-a596-907a3cb3c036 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Jan 21 18:28:36 compute-0 nova_compute[183278]: 2026-01-21 18:28:36.458 183284 DEBUG nova.virt.driver [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Emitting event <LifecycleEvent: 1769020116.4577718, 4f3aff6b-5152-4498-a9fc-faba398385b8 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:28:36 compute-0 nova_compute[183278]: 2026-01-21 18:28:36.458 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] VM Paused (Lifecycle Event)
Jan 21 18:28:36 compute-0 nova_compute[183278]: 2026-01-21 18:28:36.552 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:28:36 compute-0 nova_compute[183278]: 2026-01-21 18:28:36.556 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 18:28:36 compute-0 nova_compute[183278]: 2026-01-21 18:28:36.578 183284 DEBUG nova.virt.libvirt.migration [None req-e898b704-760b-4775-a596-907a3cb3c036 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 21 18:28:36 compute-0 nova_compute[183278]: 2026-01-21 18:28:36.578 183284 DEBUG nova.virt.libvirt.migration [None req-e898b704-760b-4775-a596-907a3cb3c036 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Jan 21 18:28:36 compute-0 kernel: tap79571f69-76 (unregistering): left promiscuous mode
Jan 21 18:28:36 compute-0 NetworkManager[55506]: <info>  [1769020116.6105] device (tap79571f69-76): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 18:28:36 compute-0 nova_compute[183278]: 2026-01-21 18:28:36.628 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:28:36 compute-0 ovn_controller[95419]: 2026-01-21T18:28:36Z|00122|binding|INFO|Releasing lport 79571f69-76ad-462f-8176-f34f6dab8ddb from this chassis (sb_readonly=0)
Jan 21 18:28:36 compute-0 ovn_controller[95419]: 2026-01-21T18:28:36Z|00123|binding|INFO|Setting lport 79571f69-76ad-462f-8176-f34f6dab8ddb down in Southbound
Jan 21 18:28:36 compute-0 ovn_controller[95419]: 2026-01-21T18:28:36Z|00124|binding|INFO|Removing iface tap79571f69-76 ovn-installed in OVS
Jan 21 18:28:36 compute-0 nova_compute[183278]: 2026-01-21 18:28:36.631 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:28:36 compute-0 nova_compute[183278]: 2026-01-21 18:28:36.645 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:28:36 compute-0 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000000f.scope: Deactivated successfully.
Jan 21 18:28:36 compute-0 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000000f.scope: Consumed 15.015s CPU time.
Jan 21 18:28:36 compute-0 systemd-machined[154592]: Machine qemu-11-instance-0000000f terminated.
Jan 21 18:28:36 compute-0 podman[208988]: 2026-01-21 18:28:36.708832346 +0000 UTC m=+0.062186286 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 21 18:28:36 compute-0 nova_compute[183278]: 2026-01-21 18:28:36.767 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] During sync_power_state the instance has a pending task (migrating). Skip.
Jan 21 18:28:36 compute-0 podman[208985]: 2026-01-21 18:28:36.77472353 +0000 UTC m=+0.129605457 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 21 18:28:36 compute-0 ovn_controller[95419]: 2026-01-21T18:28:36Z|00125|binding|INFO|Releasing lport 9c897ad2-8ce5-4903-8c83-1ed8f117dcdd from this chassis (sb_readonly=0)
Jan 21 18:28:36 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:28:36.823 104698 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5d:b5:96 10.100.0.13'], port_security=['fa:16:3e:5d:b5:96 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '88a62794-b4a4-47e3-9cce-91e574e684c1'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '4f3aff6b-5152-4498-a9fc-faba398385b8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-405ec01b-76d3-4c3c-a31b-5f16d9641fbf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe688847145f4dee992c72dd40bbc1ac', 'neutron:revision_number': '8', 'neutron:security_group_ids': '772bc98e-4f63-476c-ac15-1e185ee339f7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=64b67886-c0d9-40d2-a2d0-cf96a9cd3c14, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>], logical_port=79571f69-76ad-462f-8176-f34f6dab8ddb) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 18:28:36 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:28:36.824 104698 INFO neutron.agent.ovn.metadata.agent [-] Port 79571f69-76ad-462f-8176-f34f6dab8ddb in datapath 405ec01b-76d3-4c3c-a31b-5f16d9641fbf unbound from our chassis
Jan 21 18:28:36 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:28:36.826 104698 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 405ec01b-76d3-4c3c-a31b-5f16d9641fbf, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 21 18:28:36 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:28:36.827 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[dbb098c8-4b42-415b-9c90-6fc8e6b337ae]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:28:36 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:28:36.828 104698 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf namespace which is not needed anymore
Jan 21 18:28:36 compute-0 nova_compute[183278]: 2026-01-21 18:28:36.857 183284 DEBUG nova.virt.libvirt.driver [None req-e898b704-760b-4775-a596-907a3cb3c036 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Jan 21 18:28:36 compute-0 nova_compute[183278]: 2026-01-21 18:28:36.857 183284 DEBUG nova.virt.libvirt.driver [None req-e898b704-760b-4775-a596-907a3cb3c036 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Jan 21 18:28:36 compute-0 nova_compute[183278]: 2026-01-21 18:28:36.858 183284 DEBUG nova.virt.libvirt.driver [None req-e898b704-760b-4775-a596-907a3cb3c036 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Jan 21 18:28:36 compute-0 nova_compute[183278]: 2026-01-21 18:28:36.863 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:28:36 compute-0 neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf[208720]: [NOTICE]   (208724) : haproxy version is 2.8.14-c23fe91
Jan 21 18:28:36 compute-0 neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf[208720]: [NOTICE]   (208724) : path to executable is /usr/sbin/haproxy
Jan 21 18:28:36 compute-0 neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf[208720]: [WARNING]  (208724) : Exiting Master process...
Jan 21 18:28:36 compute-0 neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf[208720]: [ALERT]    (208724) : Current worker (208726) exited with code 143 (Terminated)
Jan 21 18:28:36 compute-0 neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf[208720]: [WARNING]  (208724) : All workers exited. Exiting... (0)
Jan 21 18:28:36 compute-0 systemd[1]: libpod-66acd98b066ec74a4f024834b47a48ce7c3883a5f07291b01036be9258827276.scope: Deactivated successfully.
Jan 21 18:28:36 compute-0 podman[209070]: 2026-01-21 18:28:36.963195341 +0000 UTC m=+0.048876184 container died 66acd98b066ec74a4f024834b47a48ce7c3883a5f07291b01036be9258827276 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 21 18:28:36 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-66acd98b066ec74a4f024834b47a48ce7c3883a5f07291b01036be9258827276-userdata-shm.mount: Deactivated successfully.
Jan 21 18:28:36 compute-0 systemd[1]: var-lib-containers-storage-overlay-a5fb811a3a232fd61c168e00475670505784f906e548f01048f7dad324faa3d9-merged.mount: Deactivated successfully.
Jan 21 18:28:37 compute-0 podman[209070]: 2026-01-21 18:28:37.003162477 +0000 UTC m=+0.088843310 container cleanup 66acd98b066ec74a4f024834b47a48ce7c3883a5f07291b01036be9258827276 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 18:28:37 compute-0 systemd[1]: libpod-conmon-66acd98b066ec74a4f024834b47a48ce7c3883a5f07291b01036be9258827276.scope: Deactivated successfully.
Jan 21 18:28:37 compute-0 podman[209102]: 2026-01-21 18:28:37.067487994 +0000 UTC m=+0.043479363 container remove 66acd98b066ec74a4f024834b47a48ce7c3883a5f07291b01036be9258827276 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 21 18:28:37 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:28:37.072 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[e5780c19-82b6-4501-9ad8-7f195cf91402]: (4, ('Wed Jan 21 06:28:36 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf (66acd98b066ec74a4f024834b47a48ce7c3883a5f07291b01036be9258827276)\n66acd98b066ec74a4f024834b47a48ce7c3883a5f07291b01036be9258827276\nWed Jan 21 06:28:37 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf (66acd98b066ec74a4f024834b47a48ce7c3883a5f07291b01036be9258827276)\n66acd98b066ec74a4f024834b47a48ce7c3883a5f07291b01036be9258827276\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:28:37 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:28:37.074 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[cc66cfa1-e38c-4d56-bb35-ba7869993c4f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:28:37 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:28:37.075 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap405ec01b-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:28:37 compute-0 nova_compute[183278]: 2026-01-21 18:28:37.076 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:28:37 compute-0 kernel: tap405ec01b-70: left promiscuous mode
Jan 21 18:28:37 compute-0 nova_compute[183278]: 2026-01-21 18:28:37.081 183284 DEBUG nova.virt.libvirt.guest [None req-e898b704-760b-4775-a596-907a3cb3c036 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid '4f3aff6b-5152-4498-a9fc-faba398385b8' (instance-0000000f) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Jan 21 18:28:37 compute-0 nova_compute[183278]: 2026-01-21 18:28:37.081 183284 INFO nova.virt.libvirt.driver [None req-e898b704-760b-4775-a596-907a3cb3c036 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Migration operation has completed
Jan 21 18:28:37 compute-0 nova_compute[183278]: 2026-01-21 18:28:37.082 183284 INFO nova.compute.manager [None req-e898b704-760b-4775-a596-907a3cb3c036 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] _post_live_migration() is started..
Jan 21 18:28:37 compute-0 nova_compute[183278]: 2026-01-21 18:28:37.091 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:28:37 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:28:37.094 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[af9a0283-89a7-4897-a755-fad75c76cfc7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:28:37 compute-0 nova_compute[183278]: 2026-01-21 18:28:37.100 183284 DEBUG nova.network.neutron [req-19507d78-75f4-4f63-b5c1-d2cc8ede9c4f req-2825b028-52af-4d6c-934d-3bcad0627b5a 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Updated VIF entry in instance network info cache for port 79571f69-76ad-462f-8176-f34f6dab8ddb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 18:28:37 compute-0 nova_compute[183278]: 2026-01-21 18:28:37.101 183284 DEBUG nova.network.neutron [req-19507d78-75f4-4f63-b5c1-d2cc8ede9c4f req-2825b028-52af-4d6c-934d-3bcad0627b5a 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Updating instance_info_cache with network_info: [{"id": "79571f69-76ad-462f-8176-f34f6dab8ddb", "address": "fa:16:3e:5d:b5:96", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap79571f69-76", "ovs_interfaceid": "79571f69-76ad-462f-8176-f34f6dab8ddb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:28:37 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:28:37.115 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[f1e73d39-5b23-46a8-9c60-a5c02c79b8cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:28:37 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:28:37.116 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[346f2831-eb87-486c-8d24-a33bb0d9773c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:28:37 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:28:37.131 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[bc6c3f5b-4512-4fb9-965a-9d7b59c7f753]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 461399, 'reachable_time': 26599, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 209120, 'error': None, 'target': 'ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:28:37 compute-0 systemd[1]: run-netns-ovnmeta\x2d405ec01b\x2d76d3\x2d4c3c\x2da31b\x2d5f16d9641fbf.mount: Deactivated successfully.
Jan 21 18:28:37 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:28:37.134 105036 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 21 18:28:37 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:28:37.135 105036 DEBUG oslo.privsep.daemon [-] privsep: reply[b370793e-62ce-4e13-a448-cfdaca2992c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:28:37 compute-0 nova_compute[183278]: 2026-01-21 18:28:37.253 183284 DEBUG nova.compute.manager [req-f6c91f54-a01f-41ed-bf00-b1d9efe456b2 req-e5987ce1-b832-4069-9e42-4362b9ff5aa7 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Received event network-vif-unplugged-79571f69-76ad-462f-8176-f34f6dab8ddb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:28:37 compute-0 nova_compute[183278]: 2026-01-21 18:28:37.253 183284 DEBUG oslo_concurrency.lockutils [req-f6c91f54-a01f-41ed-bf00-b1d9efe456b2 req-e5987ce1-b832-4069-9e42-4362b9ff5aa7 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "4f3aff6b-5152-4498-a9fc-faba398385b8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:28:37 compute-0 nova_compute[183278]: 2026-01-21 18:28:37.253 183284 DEBUG oslo_concurrency.lockutils [req-f6c91f54-a01f-41ed-bf00-b1d9efe456b2 req-e5987ce1-b832-4069-9e42-4362b9ff5aa7 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "4f3aff6b-5152-4498-a9fc-faba398385b8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:28:37 compute-0 nova_compute[183278]: 2026-01-21 18:28:37.254 183284 DEBUG oslo_concurrency.lockutils [req-f6c91f54-a01f-41ed-bf00-b1d9efe456b2 req-e5987ce1-b832-4069-9e42-4362b9ff5aa7 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "4f3aff6b-5152-4498-a9fc-faba398385b8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:28:37 compute-0 nova_compute[183278]: 2026-01-21 18:28:37.254 183284 DEBUG nova.compute.manager [req-f6c91f54-a01f-41ed-bf00-b1d9efe456b2 req-e5987ce1-b832-4069-9e42-4362b9ff5aa7 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] No waiting events found dispatching network-vif-unplugged-79571f69-76ad-462f-8176-f34f6dab8ddb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:28:37 compute-0 nova_compute[183278]: 2026-01-21 18:28:37.254 183284 DEBUG nova.compute.manager [req-f6c91f54-a01f-41ed-bf00-b1d9efe456b2 req-e5987ce1-b832-4069-9e42-4362b9ff5aa7 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Received event network-vif-unplugged-79571f69-76ad-462f-8176-f34f6dab8ddb for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 21 18:28:37 compute-0 nova_compute[183278]: 2026-01-21 18:28:37.255 183284 DEBUG oslo_concurrency.lockutils [req-19507d78-75f4-4f63-b5c1-d2cc8ede9c4f req-2825b028-52af-4d6c-934d-3bcad0627b5a 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Releasing lock "refresh_cache-4f3aff6b-5152-4498-a9fc-faba398385b8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 18:28:37 compute-0 nova_compute[183278]: 2026-01-21 18:28:37.715 183284 DEBUG nova.compute.manager [req-8ec7868d-ae6f-48e8-a736-8dfef4b37462 req-7a15766a-6260-4220-8691-82be8bb3582e 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Received event network-vif-unplugged-79571f69-76ad-462f-8176-f34f6dab8ddb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:28:37 compute-0 nova_compute[183278]: 2026-01-21 18:28:37.715 183284 DEBUG oslo_concurrency.lockutils [req-8ec7868d-ae6f-48e8-a736-8dfef4b37462 req-7a15766a-6260-4220-8691-82be8bb3582e 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "4f3aff6b-5152-4498-a9fc-faba398385b8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:28:37 compute-0 nova_compute[183278]: 2026-01-21 18:28:37.716 183284 DEBUG oslo_concurrency.lockutils [req-8ec7868d-ae6f-48e8-a736-8dfef4b37462 req-7a15766a-6260-4220-8691-82be8bb3582e 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "4f3aff6b-5152-4498-a9fc-faba398385b8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:28:37 compute-0 nova_compute[183278]: 2026-01-21 18:28:37.716 183284 DEBUG oslo_concurrency.lockutils [req-8ec7868d-ae6f-48e8-a736-8dfef4b37462 req-7a15766a-6260-4220-8691-82be8bb3582e 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "4f3aff6b-5152-4498-a9fc-faba398385b8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:28:37 compute-0 nova_compute[183278]: 2026-01-21 18:28:37.716 183284 DEBUG nova.compute.manager [req-8ec7868d-ae6f-48e8-a736-8dfef4b37462 req-7a15766a-6260-4220-8691-82be8bb3582e 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] No waiting events found dispatching network-vif-unplugged-79571f69-76ad-462f-8176-f34f6dab8ddb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:28:37 compute-0 nova_compute[183278]: 2026-01-21 18:28:37.716 183284 DEBUG nova.compute.manager [req-8ec7868d-ae6f-48e8-a736-8dfef4b37462 req-7a15766a-6260-4220-8691-82be8bb3582e 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Received event network-vif-unplugged-79571f69-76ad-462f-8176-f34f6dab8ddb for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 21 18:28:38 compute-0 nova_compute[183278]: 2026-01-21 18:28:38.260 183284 DEBUG nova.network.neutron [None req-e898b704-760b-4775-a596-907a3cb3c036 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Activated binding for port 79571f69-76ad-462f-8176-f34f6dab8ddb and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Jan 21 18:28:38 compute-0 nova_compute[183278]: 2026-01-21 18:28:38.261 183284 DEBUG nova.compute.manager [None req-e898b704-760b-4775-a596-907a3cb3c036 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "79571f69-76ad-462f-8176-f34f6dab8ddb", "address": "fa:16:3e:5d:b5:96", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap79571f69-76", "ovs_interfaceid": "79571f69-76ad-462f-8176-f34f6dab8ddb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Jan 21 18:28:38 compute-0 nova_compute[183278]: 2026-01-21 18:28:38.262 183284 DEBUG nova.virt.libvirt.vif [None req-e898b704-760b-4775-a596-907a3cb3c036 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T18:27:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1799071958',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1799071958',id=15,image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T18:27:31Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='fe688847145f4dee992c72dd40bbc1ac',ramdisk_id='',reservation_id='r-2pe732ao',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',
owner_project_name='tempest-TestExecuteStrategies-1753607426',owner_user_name='tempest-TestExecuteStrategies-1753607426-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T18:28:27Z,user_data=None,user_id='41dc6e790bc54fbfaf5c6007d3fa5f63',uuid=4f3aff6b-5152-4498-a9fc-faba398385b8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "79571f69-76ad-462f-8176-f34f6dab8ddb", "address": "fa:16:3e:5d:b5:96", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap79571f69-76", "ovs_interfaceid": "79571f69-76ad-462f-8176-f34f6dab8ddb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 18:28:38 compute-0 nova_compute[183278]: 2026-01-21 18:28:38.262 183284 DEBUG nova.network.os_vif_util [None req-e898b704-760b-4775-a596-907a3cb3c036 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Converting VIF {"id": "79571f69-76ad-462f-8176-f34f6dab8ddb", "address": "fa:16:3e:5d:b5:96", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap79571f69-76", "ovs_interfaceid": "79571f69-76ad-462f-8176-f34f6dab8ddb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 18:28:38 compute-0 nova_compute[183278]: 2026-01-21 18:28:38.263 183284 DEBUG nova.network.os_vif_util [None req-e898b704-760b-4775-a596-907a3cb3c036 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:5d:b5:96,bridge_name='br-int',has_traffic_filtering=True,id=79571f69-76ad-462f-8176-f34f6dab8ddb,network=Network(405ec01b-76d3-4c3c-a31b-5f16d9641fbf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap79571f69-76') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 18:28:38 compute-0 nova_compute[183278]: 2026-01-21 18:28:38.263 183284 DEBUG os_vif [None req-e898b704-760b-4775-a596-907a3cb3c036 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:5d:b5:96,bridge_name='br-int',has_traffic_filtering=True,id=79571f69-76ad-462f-8176-f34f6dab8ddb,network=Network(405ec01b-76d3-4c3c-a31b-5f16d9641fbf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap79571f69-76') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 21 18:28:38 compute-0 nova_compute[183278]: 2026-01-21 18:28:38.265 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:28:38 compute-0 nova_compute[183278]: 2026-01-21 18:28:38.265 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap79571f69-76, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:28:38 compute-0 nova_compute[183278]: 2026-01-21 18:28:38.266 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:28:38 compute-0 nova_compute[183278]: 2026-01-21 18:28:38.267 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:28:38 compute-0 nova_compute[183278]: 2026-01-21 18:28:38.269 183284 INFO os_vif [None req-e898b704-760b-4775-a596-907a3cb3c036 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:5d:b5:96,bridge_name='br-int',has_traffic_filtering=True,id=79571f69-76ad-462f-8176-f34f6dab8ddb,network=Network(405ec01b-76d3-4c3c-a31b-5f16d9641fbf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap79571f69-76')
Jan 21 18:28:38 compute-0 nova_compute[183278]: 2026-01-21 18:28:38.270 183284 DEBUG oslo_concurrency.lockutils [None req-e898b704-760b-4775-a596-907a3cb3c036 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:28:38 compute-0 nova_compute[183278]: 2026-01-21 18:28:38.270 183284 DEBUG oslo_concurrency.lockutils [None req-e898b704-760b-4775-a596-907a3cb3c036 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:28:38 compute-0 nova_compute[183278]: 2026-01-21 18:28:38.270 183284 DEBUG oslo_concurrency.lockutils [None req-e898b704-760b-4775-a596-907a3cb3c036 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:28:38 compute-0 nova_compute[183278]: 2026-01-21 18:28:38.271 183284 DEBUG nova.compute.manager [None req-e898b704-760b-4775-a596-907a3cb3c036 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Jan 21 18:28:38 compute-0 nova_compute[183278]: 2026-01-21 18:28:38.271 183284 INFO nova.virt.libvirt.driver [None req-e898b704-760b-4775-a596-907a3cb3c036 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Deleting instance files /var/lib/nova/instances/4f3aff6b-5152-4498-a9fc-faba398385b8_del
Jan 21 18:28:38 compute-0 nova_compute[183278]: 2026-01-21 18:28:38.272 183284 INFO nova.virt.libvirt.driver [None req-e898b704-760b-4775-a596-907a3cb3c036 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Deletion of /var/lib/nova/instances/4f3aff6b-5152-4498-a9fc-faba398385b8_del complete
Jan 21 18:28:39 compute-0 nova_compute[183278]: 2026-01-21 18:28:39.333 183284 DEBUG nova.compute.manager [req-66bbf357-bc2a-41df-b27b-c5f57fd8458e req-ae87d4bf-cad4-4037-aebb-a87466ecac04 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Received event network-vif-plugged-79571f69-76ad-462f-8176-f34f6dab8ddb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:28:39 compute-0 nova_compute[183278]: 2026-01-21 18:28:39.333 183284 DEBUG oslo_concurrency.lockutils [req-66bbf357-bc2a-41df-b27b-c5f57fd8458e req-ae87d4bf-cad4-4037-aebb-a87466ecac04 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "4f3aff6b-5152-4498-a9fc-faba398385b8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:28:39 compute-0 nova_compute[183278]: 2026-01-21 18:28:39.334 183284 DEBUG oslo_concurrency.lockutils [req-66bbf357-bc2a-41df-b27b-c5f57fd8458e req-ae87d4bf-cad4-4037-aebb-a87466ecac04 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "4f3aff6b-5152-4498-a9fc-faba398385b8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:28:39 compute-0 nova_compute[183278]: 2026-01-21 18:28:39.334 183284 DEBUG oslo_concurrency.lockutils [req-66bbf357-bc2a-41df-b27b-c5f57fd8458e req-ae87d4bf-cad4-4037-aebb-a87466ecac04 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "4f3aff6b-5152-4498-a9fc-faba398385b8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:28:39 compute-0 nova_compute[183278]: 2026-01-21 18:28:39.334 183284 DEBUG nova.compute.manager [req-66bbf357-bc2a-41df-b27b-c5f57fd8458e req-ae87d4bf-cad4-4037-aebb-a87466ecac04 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] No waiting events found dispatching network-vif-plugged-79571f69-76ad-462f-8176-f34f6dab8ddb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:28:39 compute-0 nova_compute[183278]: 2026-01-21 18:28:39.334 183284 WARNING nova.compute.manager [req-66bbf357-bc2a-41df-b27b-c5f57fd8458e req-ae87d4bf-cad4-4037-aebb-a87466ecac04 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Received unexpected event network-vif-plugged-79571f69-76ad-462f-8176-f34f6dab8ddb for instance with vm_state active and task_state migrating.
Jan 21 18:28:39 compute-0 nova_compute[183278]: 2026-01-21 18:28:39.335 183284 DEBUG nova.compute.manager [req-66bbf357-bc2a-41df-b27b-c5f57fd8458e req-ae87d4bf-cad4-4037-aebb-a87466ecac04 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Received event network-vif-plugged-79571f69-76ad-462f-8176-f34f6dab8ddb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:28:39 compute-0 nova_compute[183278]: 2026-01-21 18:28:39.335 183284 DEBUG oslo_concurrency.lockutils [req-66bbf357-bc2a-41df-b27b-c5f57fd8458e req-ae87d4bf-cad4-4037-aebb-a87466ecac04 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "4f3aff6b-5152-4498-a9fc-faba398385b8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:28:39 compute-0 nova_compute[183278]: 2026-01-21 18:28:39.335 183284 DEBUG oslo_concurrency.lockutils [req-66bbf357-bc2a-41df-b27b-c5f57fd8458e req-ae87d4bf-cad4-4037-aebb-a87466ecac04 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "4f3aff6b-5152-4498-a9fc-faba398385b8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:28:39 compute-0 nova_compute[183278]: 2026-01-21 18:28:39.335 183284 DEBUG oslo_concurrency.lockutils [req-66bbf357-bc2a-41df-b27b-c5f57fd8458e req-ae87d4bf-cad4-4037-aebb-a87466ecac04 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "4f3aff6b-5152-4498-a9fc-faba398385b8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:28:39 compute-0 nova_compute[183278]: 2026-01-21 18:28:39.335 183284 DEBUG nova.compute.manager [req-66bbf357-bc2a-41df-b27b-c5f57fd8458e req-ae87d4bf-cad4-4037-aebb-a87466ecac04 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] No waiting events found dispatching network-vif-plugged-79571f69-76ad-462f-8176-f34f6dab8ddb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:28:39 compute-0 nova_compute[183278]: 2026-01-21 18:28:39.336 183284 WARNING nova.compute.manager [req-66bbf357-bc2a-41df-b27b-c5f57fd8458e req-ae87d4bf-cad4-4037-aebb-a87466ecac04 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Received unexpected event network-vif-plugged-79571f69-76ad-462f-8176-f34f6dab8ddb for instance with vm_state active and task_state migrating.
Jan 21 18:28:39 compute-0 nova_compute[183278]: 2026-01-21 18:28:39.336 183284 DEBUG nova.compute.manager [req-66bbf357-bc2a-41df-b27b-c5f57fd8458e req-ae87d4bf-cad4-4037-aebb-a87466ecac04 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Received event network-vif-plugged-79571f69-76ad-462f-8176-f34f6dab8ddb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:28:39 compute-0 nova_compute[183278]: 2026-01-21 18:28:39.336 183284 DEBUG oslo_concurrency.lockutils [req-66bbf357-bc2a-41df-b27b-c5f57fd8458e req-ae87d4bf-cad4-4037-aebb-a87466ecac04 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "4f3aff6b-5152-4498-a9fc-faba398385b8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:28:39 compute-0 nova_compute[183278]: 2026-01-21 18:28:39.336 183284 DEBUG oslo_concurrency.lockutils [req-66bbf357-bc2a-41df-b27b-c5f57fd8458e req-ae87d4bf-cad4-4037-aebb-a87466ecac04 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "4f3aff6b-5152-4498-a9fc-faba398385b8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:28:39 compute-0 nova_compute[183278]: 2026-01-21 18:28:39.337 183284 DEBUG oslo_concurrency.lockutils [req-66bbf357-bc2a-41df-b27b-c5f57fd8458e req-ae87d4bf-cad4-4037-aebb-a87466ecac04 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "4f3aff6b-5152-4498-a9fc-faba398385b8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:28:39 compute-0 nova_compute[183278]: 2026-01-21 18:28:39.337 183284 DEBUG nova.compute.manager [req-66bbf357-bc2a-41df-b27b-c5f57fd8458e req-ae87d4bf-cad4-4037-aebb-a87466ecac04 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] No waiting events found dispatching network-vif-plugged-79571f69-76ad-462f-8176-f34f6dab8ddb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:28:39 compute-0 nova_compute[183278]: 2026-01-21 18:28:39.337 183284 WARNING nova.compute.manager [req-66bbf357-bc2a-41df-b27b-c5f57fd8458e req-ae87d4bf-cad4-4037-aebb-a87466ecac04 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Received unexpected event network-vif-plugged-79571f69-76ad-462f-8176-f34f6dab8ddb for instance with vm_state active and task_state migrating.
Jan 21 18:28:39 compute-0 nova_compute[183278]: 2026-01-21 18:28:39.786 183284 DEBUG nova.compute.manager [req-8262ad89-c964-4a5d-9a10-106f29d99ee7 req-d7808893-14e0-4b5f-a441-3fbc29544f0b 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Received event network-vif-plugged-79571f69-76ad-462f-8176-f34f6dab8ddb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:28:39 compute-0 nova_compute[183278]: 2026-01-21 18:28:39.786 183284 DEBUG oslo_concurrency.lockutils [req-8262ad89-c964-4a5d-9a10-106f29d99ee7 req-d7808893-14e0-4b5f-a441-3fbc29544f0b 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "4f3aff6b-5152-4498-a9fc-faba398385b8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:28:39 compute-0 nova_compute[183278]: 2026-01-21 18:28:39.786 183284 DEBUG oslo_concurrency.lockutils [req-8262ad89-c964-4a5d-9a10-106f29d99ee7 req-d7808893-14e0-4b5f-a441-3fbc29544f0b 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "4f3aff6b-5152-4498-a9fc-faba398385b8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:28:39 compute-0 nova_compute[183278]: 2026-01-21 18:28:39.787 183284 DEBUG oslo_concurrency.lockutils [req-8262ad89-c964-4a5d-9a10-106f29d99ee7 req-d7808893-14e0-4b5f-a441-3fbc29544f0b 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "4f3aff6b-5152-4498-a9fc-faba398385b8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:28:39 compute-0 nova_compute[183278]: 2026-01-21 18:28:39.787 183284 DEBUG nova.compute.manager [req-8262ad89-c964-4a5d-9a10-106f29d99ee7 req-d7808893-14e0-4b5f-a441-3fbc29544f0b 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] No waiting events found dispatching network-vif-plugged-79571f69-76ad-462f-8176-f34f6dab8ddb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:28:39 compute-0 nova_compute[183278]: 2026-01-21 18:28:39.787 183284 WARNING nova.compute.manager [req-8262ad89-c964-4a5d-9a10-106f29d99ee7 req-d7808893-14e0-4b5f-a441-3fbc29544f0b 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Received unexpected event network-vif-plugged-79571f69-76ad-462f-8176-f34f6dab8ddb for instance with vm_state active and task_state migrating.
Jan 21 18:28:40 compute-0 nova_compute[183278]: 2026-01-21 18:28:40.436 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:28:40 compute-0 podman[209121]: 2026-01-21 18:28:40.988336594 +0000 UTC m=+0.047200943 container health_status 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 21 18:28:41 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:28:41.220 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a4db021d-a451-4e5f-8011-49af760bda68, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:28:42 compute-0 nova_compute[183278]: 2026-01-21 18:28:42.086 183284 DEBUG oslo_concurrency.lockutils [None req-e898b704-760b-4775-a596-907a3cb3c036 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "4f3aff6b-5152-4498-a9fc-faba398385b8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:28:42 compute-0 nova_compute[183278]: 2026-01-21 18:28:42.086 183284 DEBUG oslo_concurrency.lockutils [None req-e898b704-760b-4775-a596-907a3cb3c036 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lock "4f3aff6b-5152-4498-a9fc-faba398385b8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:28:42 compute-0 nova_compute[183278]: 2026-01-21 18:28:42.086 183284 DEBUG oslo_concurrency.lockutils [None req-e898b704-760b-4775-a596-907a3cb3c036 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lock "4f3aff6b-5152-4498-a9fc-faba398385b8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:28:42 compute-0 nova_compute[183278]: 2026-01-21 18:28:42.108 183284 DEBUG oslo_concurrency.lockutils [None req-e898b704-760b-4775-a596-907a3cb3c036 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:28:42 compute-0 nova_compute[183278]: 2026-01-21 18:28:42.109 183284 DEBUG oslo_concurrency.lockutils [None req-e898b704-760b-4775-a596-907a3cb3c036 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:28:42 compute-0 nova_compute[183278]: 2026-01-21 18:28:42.109 183284 DEBUG oslo_concurrency.lockutils [None req-e898b704-760b-4775-a596-907a3cb3c036 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:28:42 compute-0 nova_compute[183278]: 2026-01-21 18:28:42.109 183284 DEBUG nova.compute.resource_tracker [None req-e898b704-760b-4775-a596-907a3cb3c036 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 18:28:42 compute-0 nova_compute[183278]: 2026-01-21 18:28:42.283 183284 WARNING nova.virt.libvirt.driver [None req-e898b704-760b-4775-a596-907a3cb3c036 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 18:28:42 compute-0 nova_compute[183278]: 2026-01-21 18:28:42.284 183284 DEBUG nova.compute.resource_tracker [None req-e898b704-760b-4775-a596-907a3cb3c036 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5808MB free_disk=73.38151168823242GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 18:28:42 compute-0 nova_compute[183278]: 2026-01-21 18:28:42.285 183284 DEBUG oslo_concurrency.lockutils [None req-e898b704-760b-4775-a596-907a3cb3c036 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:28:42 compute-0 nova_compute[183278]: 2026-01-21 18:28:42.285 183284 DEBUG oslo_concurrency.lockutils [None req-e898b704-760b-4775-a596-907a3cb3c036 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:28:42 compute-0 nova_compute[183278]: 2026-01-21 18:28:42.334 183284 DEBUG nova.compute.resource_tracker [None req-e898b704-760b-4775-a596-907a3cb3c036 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Migration for instance 4f3aff6b-5152-4498-a9fc-faba398385b8 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Jan 21 18:28:42 compute-0 nova_compute[183278]: 2026-01-21 18:28:42.355 183284 DEBUG nova.compute.resource_tracker [None req-e898b704-760b-4775-a596-907a3cb3c036 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Jan 21 18:28:42 compute-0 nova_compute[183278]: 2026-01-21 18:28:42.383 183284 DEBUG nova.compute.resource_tracker [None req-e898b704-760b-4775-a596-907a3cb3c036 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Migration 52a63dd4-955f-4822-86b3-7a70b1a105d5 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Jan 21 18:28:42 compute-0 nova_compute[183278]: 2026-01-21 18:28:42.383 183284 DEBUG nova.compute.resource_tracker [None req-e898b704-760b-4775-a596-907a3cb3c036 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 18:28:42 compute-0 nova_compute[183278]: 2026-01-21 18:28:42.384 183284 DEBUG nova.compute.resource_tracker [None req-e898b704-760b-4775-a596-907a3cb3c036 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 18:28:42 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Jan 21 18:28:42 compute-0 systemd[208963]: Activating special unit Exit the Session...
Jan 21 18:28:42 compute-0 systemd[208963]: Stopped target Main User Target.
Jan 21 18:28:42 compute-0 systemd[208963]: Stopped target Basic System.
Jan 21 18:28:42 compute-0 systemd[208963]: Stopped target Paths.
Jan 21 18:28:42 compute-0 systemd[208963]: Stopped target Sockets.
Jan 21 18:28:42 compute-0 systemd[208963]: Stopped target Timers.
Jan 21 18:28:42 compute-0 systemd[208963]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 21 18:28:42 compute-0 systemd[208963]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 21 18:28:42 compute-0 systemd[208963]: Closed D-Bus User Message Bus Socket.
Jan 21 18:28:42 compute-0 systemd[208963]: Stopped Create User's Volatile Files and Directories.
Jan 21 18:28:42 compute-0 systemd[208963]: Removed slice User Application Slice.
Jan 21 18:28:42 compute-0 systemd[208963]: Reached target Shutdown.
Jan 21 18:28:42 compute-0 systemd[208963]: Finished Exit the Session.
Jan 21 18:28:42 compute-0 systemd[208963]: Reached target Exit the Session.
Jan 21 18:28:42 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Jan 21 18:28:42 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Jan 21 18:28:42 compute-0 nova_compute[183278]: 2026-01-21 18:28:42.430 183284 DEBUG nova.compute.provider_tree [None req-e898b704-760b-4775-a596-907a3cb3c036 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Inventory has not changed in ProviderTree for provider: 502e4243-611b-433d-a766-9b485d51652d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 18:28:42 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 21 18:28:42 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 21 18:28:42 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 21 18:28:42 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 21 18:28:42 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Jan 21 18:28:42 compute-0 nova_compute[183278]: 2026-01-21 18:28:42.472 183284 DEBUG nova.scheduler.client.report [None req-e898b704-760b-4775-a596-907a3cb3c036 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Inventory has not changed for provider 502e4243-611b-433d-a766-9b485d51652d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 18:28:42 compute-0 nova_compute[183278]: 2026-01-21 18:28:42.491 183284 DEBUG nova.compute.resource_tracker [None req-e898b704-760b-4775-a596-907a3cb3c036 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 18:28:42 compute-0 nova_compute[183278]: 2026-01-21 18:28:42.491 183284 DEBUG oslo_concurrency.lockutils [None req-e898b704-760b-4775-a596-907a3cb3c036 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.206s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:28:42 compute-0 nova_compute[183278]: 2026-01-21 18:28:42.496 183284 INFO nova.compute.manager [None req-e898b704-760b-4775-a596-907a3cb3c036 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Jan 21 18:28:42 compute-0 nova_compute[183278]: 2026-01-21 18:28:42.565 183284 INFO nova.scheduler.client.report [None req-e898b704-760b-4775-a596-907a3cb3c036 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Deleted allocation for migration 52a63dd4-955f-4822-86b3-7a70b1a105d5
Jan 21 18:28:42 compute-0 nova_compute[183278]: 2026-01-21 18:28:42.565 183284 DEBUG nova.virt.libvirt.driver [None req-e898b704-760b-4775-a596-907a3cb3c036 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Jan 21 18:28:43 compute-0 nova_compute[183278]: 2026-01-21 18:28:43.323 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:28:44 compute-0 nova_compute[183278]: 2026-01-21 18:28:44.138 183284 DEBUG nova.compute.manager [None req-5bb1325c-d907-46b5-b106-8c56db2def09 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Removing trait COMPUTE_STATUS_DISABLED from compute node resource provider 502e4243-611b-433d-a766-9b485d51652d in placement. update_compute_provider_status /usr/lib/python3.9/site-packages/nova/compute/manager.py:606
Jan 21 18:28:44 compute-0 nova_compute[183278]: 2026-01-21 18:28:44.191 183284 DEBUG nova.compute.provider_tree [None req-5bb1325c-d907-46b5-b106-8c56db2def09 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Updating resource provider 502e4243-611b-433d-a766-9b485d51652d generation from 27 to 30 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Jan 21 18:28:45 compute-0 nova_compute[183278]: 2026-01-21 18:28:45.479 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:28:48 compute-0 nova_compute[183278]: 2026-01-21 18:28:48.326 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:28:50 compute-0 nova_compute[183278]: 2026-01-21 18:28:50.481 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:28:51 compute-0 nova_compute[183278]: 2026-01-21 18:28:51.856 183284 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769020116.855212, 4f3aff6b-5152-4498-a9fc-faba398385b8 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:28:51 compute-0 nova_compute[183278]: 2026-01-21 18:28:51.857 183284 INFO nova.compute.manager [-] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] VM Stopped (Lifecycle Event)
Jan 21 18:28:51 compute-0 nova_compute[183278]: 2026-01-21 18:28:51.876 183284 DEBUG nova.compute.manager [None req-ca5a6637-e64f-421e-85c7-809d5a8aa6de - - - - - -] [instance: 4f3aff6b-5152-4498-a9fc-faba398385b8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:28:53 compute-0 nova_compute[183278]: 2026-01-21 18:28:53.327 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:28:55 compute-0 nova_compute[183278]: 2026-01-21 18:28:55.483 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:28:58 compute-0 nova_compute[183278]: 2026-01-21 18:28:58.337 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:28:59 compute-0 podman[192560]: time="2026-01-21T18:28:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 18:28:59 compute-0 podman[192560]: @ - - [21/Jan/2026:18:28:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15354 "" "Go-http-client/1.1"
Jan 21 18:28:59 compute-0 podman[192560]: @ - - [21/Jan/2026:18:28:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2174 "" "Go-http-client/1.1"
Jan 21 18:29:00 compute-0 nova_compute[183278]: 2026-01-21 18:29:00.486 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:29:01 compute-0 podman[209148]: 2026-01-21 18:29:01.028599282 +0000 UTC m=+0.084179998 container health_status 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, architecture=x86_64, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, release=1755695350, io.buildah.version=1.33.7, name=ubi9-minimal, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9 Minimal.)
Jan 21 18:29:01 compute-0 openstack_network_exporter[195402]: ERROR   18:29:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 21 18:29:01 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:29:01 compute-0 openstack_network_exporter[195402]: ERROR   18:29:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 21 18:29:01 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:29:03 compute-0 nova_compute[183278]: 2026-01-21 18:29:03.339 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:29:05 compute-0 nova_compute[183278]: 2026-01-21 18:29:05.487 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:29:06 compute-0 podman[209172]: 2026-01-21 18:29:06.996047072 +0000 UTC m=+0.053779110 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, container_name=ovn_metadata_agent)
Jan 21 18:29:07 compute-0 podman[209171]: 2026-01-21 18:29:07.023755042 +0000 UTC m=+0.082180637 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 21 18:29:07 compute-0 sshd-session[209215]: Invalid user ansilbe_user from 64.227.98.100 port 44854
Jan 21 18:29:07 compute-0 sshd-session[209215]: Connection closed by invalid user ansilbe_user 64.227.98.100 port 44854 [preauth]
Jan 21 18:29:08 compute-0 nova_compute[183278]: 2026-01-21 18:29:08.341 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:29:10 compute-0 nova_compute[183278]: 2026-01-21 18:29:10.490 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:29:11 compute-0 podman[209217]: 2026-01-21 18:29:11.984624496 +0000 UTC m=+0.045022379 container health_status 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 21 18:29:13 compute-0 nova_compute[183278]: 2026-01-21 18:29:13.343 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:29:15 compute-0 nova_compute[183278]: 2026-01-21 18:29:15.492 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:29:17 compute-0 nova_compute[183278]: 2026-01-21 18:29:17.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:29:17 compute-0 nova_compute[183278]: 2026-01-21 18:29:17.817 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 18:29:17 compute-0 nova_compute[183278]: 2026-01-21 18:29:17.817 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 18:29:17 compute-0 nova_compute[183278]: 2026-01-21 18:29:17.833 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 21 18:29:18 compute-0 nova_compute[183278]: 2026-01-21 18:29:18.422 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:29:19 compute-0 ovn_controller[95419]: 2026-01-21T18:29:19Z|00126|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Jan 21 18:29:19 compute-0 nova_compute[183278]: 2026-01-21 18:29:19.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:29:19 compute-0 nova_compute[183278]: 2026-01-21 18:29:19.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:29:19 compute-0 nova_compute[183278]: 2026-01-21 18:29:19.852 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:29:19 compute-0 nova_compute[183278]: 2026-01-21 18:29:19.853 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:29:19 compute-0 nova_compute[183278]: 2026-01-21 18:29:19.854 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:29:19 compute-0 nova_compute[183278]: 2026-01-21 18:29:19.854 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 18:29:19 compute-0 nova_compute[183278]: 2026-01-21 18:29:19.984 183284 WARNING nova.virt.libvirt.driver [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 18:29:19 compute-0 nova_compute[183278]: 2026-01-21 18:29:19.985 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5835MB free_disk=73.38151168823242GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 18:29:19 compute-0 nova_compute[183278]: 2026-01-21 18:29:19.986 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:29:19 compute-0 nova_compute[183278]: 2026-01-21 18:29:19.986 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:29:20 compute-0 nova_compute[183278]: 2026-01-21 18:29:20.047 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 18:29:20 compute-0 nova_compute[183278]: 2026-01-21 18:29:20.048 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 18:29:20 compute-0 nova_compute[183278]: 2026-01-21 18:29:20.066 183284 DEBUG nova.compute.provider_tree [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Inventory has not changed in ProviderTree for provider: 502e4243-611b-433d-a766-9b485d51652d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 18:29:20 compute-0 nova_compute[183278]: 2026-01-21 18:29:20.079 183284 DEBUG nova.scheduler.client.report [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Inventory has not changed for provider 502e4243-611b-433d-a766-9b485d51652d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 18:29:20 compute-0 nova_compute[183278]: 2026-01-21 18:29:20.080 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 18:29:20 compute-0 nova_compute[183278]: 2026-01-21 18:29:20.080 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.095s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:29:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:29:20.087 104698 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:29:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:29:20.087 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:29:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:29:20.088 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:29:20 compute-0 nova_compute[183278]: 2026-01-21 18:29:20.765 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:29:23 compute-0 nova_compute[183278]: 2026-01-21 18:29:23.081 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:29:23 compute-0 nova_compute[183278]: 2026-01-21 18:29:23.423 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:29:23 compute-0 nova_compute[183278]: 2026-01-21 18:29:23.812 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:29:23 compute-0 nova_compute[183278]: 2026-01-21 18:29:23.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:29:25 compute-0 nova_compute[183278]: 2026-01-21 18:29:25.389 183284 DEBUG oslo_concurrency.lockutils [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Acquiring lock "0df2cec9-7a16-40a4-96bf-c39f8782df91" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:29:25 compute-0 nova_compute[183278]: 2026-01-21 18:29:25.390 183284 DEBUG oslo_concurrency.lockutils [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "0df2cec9-7a16-40a4-96bf-c39f8782df91" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:29:25 compute-0 nova_compute[183278]: 2026-01-21 18:29:25.489 183284 DEBUG nova.compute.manager [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 21 18:29:25 compute-0 nova_compute[183278]: 2026-01-21 18:29:25.768 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:29:25 compute-0 nova_compute[183278]: 2026-01-21 18:29:25.791 183284 DEBUG oslo_concurrency.lockutils [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:29:25 compute-0 nova_compute[183278]: 2026-01-21 18:29:25.791 183284 DEBUG oslo_concurrency.lockutils [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:29:25 compute-0 nova_compute[183278]: 2026-01-21 18:29:25.796 183284 DEBUG nova.virt.hardware [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 18:29:25 compute-0 nova_compute[183278]: 2026-01-21 18:29:25.797 183284 INFO nova.compute.claims [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Claim successful on node compute-0.ctlplane.example.com
Jan 21 18:29:25 compute-0 nova_compute[183278]: 2026-01-21 18:29:25.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:29:25 compute-0 nova_compute[183278]: 2026-01-21 18:29:25.908 183284 DEBUG nova.compute.provider_tree [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Inventory has not changed in ProviderTree for provider: 502e4243-611b-433d-a766-9b485d51652d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 18:29:25 compute-0 nova_compute[183278]: 2026-01-21 18:29:25.922 183284 DEBUG nova.scheduler.client.report [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Inventory has not changed for provider 502e4243-611b-433d-a766-9b485d51652d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 18:29:25 compute-0 nova_compute[183278]: 2026-01-21 18:29:25.942 183284 DEBUG oslo_concurrency.lockutils [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.151s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:29:25 compute-0 nova_compute[183278]: 2026-01-21 18:29:25.943 183284 DEBUG nova.compute.manager [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 21 18:29:25 compute-0 nova_compute[183278]: 2026-01-21 18:29:25.983 183284 DEBUG nova.compute.manager [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 21 18:29:25 compute-0 nova_compute[183278]: 2026-01-21 18:29:25.983 183284 DEBUG nova.network.neutron [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 21 18:29:26 compute-0 nova_compute[183278]: 2026-01-21 18:29:25.999 183284 INFO nova.virt.libvirt.driver [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 21 18:29:26 compute-0 nova_compute[183278]: 2026-01-21 18:29:26.018 183284 DEBUG nova.compute.manager [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 21 18:29:26 compute-0 nova_compute[183278]: 2026-01-21 18:29:26.105 183284 DEBUG nova.compute.manager [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 21 18:29:26 compute-0 nova_compute[183278]: 2026-01-21 18:29:26.107 183284 DEBUG nova.virt.libvirt.driver [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 18:29:26 compute-0 nova_compute[183278]: 2026-01-21 18:29:26.107 183284 INFO nova.virt.libvirt.driver [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Creating image(s)
Jan 21 18:29:26 compute-0 nova_compute[183278]: 2026-01-21 18:29:26.108 183284 DEBUG oslo_concurrency.lockutils [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Acquiring lock "/var/lib/nova/instances/0df2cec9-7a16-40a4-96bf-c39f8782df91/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:29:26 compute-0 nova_compute[183278]: 2026-01-21 18:29:26.108 183284 DEBUG oslo_concurrency.lockutils [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "/var/lib/nova/instances/0df2cec9-7a16-40a4-96bf-c39f8782df91/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:29:26 compute-0 nova_compute[183278]: 2026-01-21 18:29:26.108 183284 DEBUG oslo_concurrency.lockutils [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "/var/lib/nova/instances/0df2cec9-7a16-40a4-96bf-c39f8782df91/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:29:26 compute-0 nova_compute[183278]: 2026-01-21 18:29:26.120 183284 DEBUG oslo_concurrency.processutils [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:29:26 compute-0 nova_compute[183278]: 2026-01-21 18:29:26.176 183284 DEBUG oslo_concurrency.processutils [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:29:26 compute-0 nova_compute[183278]: 2026-01-21 18:29:26.178 183284 DEBUG oslo_concurrency.lockutils [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Acquiring lock "8bf45782a806e8f2f684ae874be9ab99d891a685" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:29:26 compute-0 nova_compute[183278]: 2026-01-21 18:29:26.178 183284 DEBUG oslo_concurrency.lockutils [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "8bf45782a806e8f2f684ae874be9ab99d891a685" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:29:26 compute-0 nova_compute[183278]: 2026-01-21 18:29:26.189 183284 DEBUG oslo_concurrency.processutils [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:29:26 compute-0 nova_compute[183278]: 2026-01-21 18:29:26.245 183284 DEBUG oslo_concurrency.processutils [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:29:26 compute-0 nova_compute[183278]: 2026-01-21 18:29:26.247 183284 DEBUG oslo_concurrency.processutils [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685,backing_fmt=raw /var/lib/nova/instances/0df2cec9-7a16-40a4-96bf-c39f8782df91/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:29:26 compute-0 nova_compute[183278]: 2026-01-21 18:29:26.281 183284 DEBUG oslo_concurrency.processutils [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685,backing_fmt=raw /var/lib/nova/instances/0df2cec9-7a16-40a4-96bf-c39f8782df91/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:29:26 compute-0 nova_compute[183278]: 2026-01-21 18:29:26.281 183284 DEBUG oslo_concurrency.lockutils [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "8bf45782a806e8f2f684ae874be9ab99d891a685" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.103s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:29:26 compute-0 nova_compute[183278]: 2026-01-21 18:29:26.282 183284 DEBUG oslo_concurrency.processutils [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:29:26 compute-0 nova_compute[183278]: 2026-01-21 18:29:26.335 183284 DEBUG oslo_concurrency.processutils [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:29:26 compute-0 nova_compute[183278]: 2026-01-21 18:29:26.336 183284 DEBUG nova.virt.disk.api [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Checking if we can resize image /var/lib/nova/instances/0df2cec9-7a16-40a4-96bf-c39f8782df91/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 18:29:26 compute-0 nova_compute[183278]: 2026-01-21 18:29:26.337 183284 DEBUG oslo_concurrency.processutils [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0df2cec9-7a16-40a4-96bf-c39f8782df91/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:29:26 compute-0 nova_compute[183278]: 2026-01-21 18:29:26.396 183284 DEBUG oslo_concurrency.processutils [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0df2cec9-7a16-40a4-96bf-c39f8782df91/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:29:26 compute-0 nova_compute[183278]: 2026-01-21 18:29:26.397 183284 DEBUG nova.virt.disk.api [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Cannot resize image /var/lib/nova/instances/0df2cec9-7a16-40a4-96bf-c39f8782df91/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 18:29:26 compute-0 nova_compute[183278]: 2026-01-21 18:29:26.398 183284 DEBUG nova.objects.instance [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lazy-loading 'migration_context' on Instance uuid 0df2cec9-7a16-40a4-96bf-c39f8782df91 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:29:26 compute-0 nova_compute[183278]: 2026-01-21 18:29:26.414 183284 DEBUG nova.virt.libvirt.driver [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 18:29:26 compute-0 nova_compute[183278]: 2026-01-21 18:29:26.414 183284 DEBUG nova.virt.libvirt.driver [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Ensure instance console log exists: /var/lib/nova/instances/0df2cec9-7a16-40a4-96bf-c39f8782df91/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 18:29:26 compute-0 nova_compute[183278]: 2026-01-21 18:29:26.415 183284 DEBUG oslo_concurrency.lockutils [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:29:26 compute-0 nova_compute[183278]: 2026-01-21 18:29:26.415 183284 DEBUG oslo_concurrency.lockutils [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:29:26 compute-0 nova_compute[183278]: 2026-01-21 18:29:26.415 183284 DEBUG oslo_concurrency.lockutils [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:29:26 compute-0 nova_compute[183278]: 2026-01-21 18:29:26.811 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:29:26 compute-0 nova_compute[183278]: 2026-01-21 18:29:26.940 183284 DEBUG nova.policy [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '41dc6e790bc54fbfaf5c6007d3fa5f63', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'fe688847145f4dee992c72dd40bbc1ac', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 21 18:29:27 compute-0 nova_compute[183278]: 2026-01-21 18:29:27.584 183284 DEBUG nova.network.neutron [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Successfully created port: 5acb086b-a54d-4c3a-a7af-bea2feeff1b4 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 21 18:29:27 compute-0 nova_compute[183278]: 2026-01-21 18:29:27.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:29:27 compute-0 nova_compute[183278]: 2026-01-21 18:29:27.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:29:27 compute-0 nova_compute[183278]: 2026-01-21 18:29:27.816 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 18:29:28 compute-0 nova_compute[183278]: 2026-01-21 18:29:28.217 183284 DEBUG nova.network.neutron [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Successfully updated port: 5acb086b-a54d-4c3a-a7af-bea2feeff1b4 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 21 18:29:28 compute-0 nova_compute[183278]: 2026-01-21 18:29:28.231 183284 DEBUG oslo_concurrency.lockutils [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Acquiring lock "refresh_cache-0df2cec9-7a16-40a4-96bf-c39f8782df91" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:29:28 compute-0 nova_compute[183278]: 2026-01-21 18:29:28.232 183284 DEBUG oslo_concurrency.lockutils [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Acquired lock "refresh_cache-0df2cec9-7a16-40a4-96bf-c39f8782df91" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 18:29:28 compute-0 nova_compute[183278]: 2026-01-21 18:29:28.232 183284 DEBUG nova.network.neutron [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 18:29:28 compute-0 nova_compute[183278]: 2026-01-21 18:29:28.302 183284 DEBUG nova.compute.manager [req-3506cead-719c-4ebe-892e-5a448818f377 req-36f9022f-77b5-4570-8fb7-dc5082259eaf 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Received event network-changed-5acb086b-a54d-4c3a-a7af-bea2feeff1b4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:29:28 compute-0 nova_compute[183278]: 2026-01-21 18:29:28.303 183284 DEBUG nova.compute.manager [req-3506cead-719c-4ebe-892e-5a448818f377 req-36f9022f-77b5-4570-8fb7-dc5082259eaf 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Refreshing instance network info cache due to event network-changed-5acb086b-a54d-4c3a-a7af-bea2feeff1b4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 18:29:28 compute-0 nova_compute[183278]: 2026-01-21 18:29:28.303 183284 DEBUG oslo_concurrency.lockutils [req-3506cead-719c-4ebe-892e-5a448818f377 req-36f9022f-77b5-4570-8fb7-dc5082259eaf 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "refresh_cache-0df2cec9-7a16-40a4-96bf-c39f8782df91" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:29:28 compute-0 nova_compute[183278]: 2026-01-21 18:29:28.424 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:29:28 compute-0 nova_compute[183278]: 2026-01-21 18:29:28.902 183284 DEBUG nova.network.neutron [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 18:29:29 compute-0 nova_compute[183278]: 2026-01-21 18:29:29.470 183284 DEBUG nova.network.neutron [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Updating instance_info_cache with network_info: [{"id": "5acb086b-a54d-4c3a-a7af-bea2feeff1b4", "address": "fa:16:3e:ab:ae:e1", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5acb086b-a5", "ovs_interfaceid": "5acb086b-a54d-4c3a-a7af-bea2feeff1b4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:29:29 compute-0 nova_compute[183278]: 2026-01-21 18:29:29.486 183284 DEBUG oslo_concurrency.lockutils [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Releasing lock "refresh_cache-0df2cec9-7a16-40a4-96bf-c39f8782df91" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 18:29:29 compute-0 nova_compute[183278]: 2026-01-21 18:29:29.486 183284 DEBUG nova.compute.manager [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Instance network_info: |[{"id": "5acb086b-a54d-4c3a-a7af-bea2feeff1b4", "address": "fa:16:3e:ab:ae:e1", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5acb086b-a5", "ovs_interfaceid": "5acb086b-a54d-4c3a-a7af-bea2feeff1b4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 21 18:29:29 compute-0 nova_compute[183278]: 2026-01-21 18:29:29.487 183284 DEBUG oslo_concurrency.lockutils [req-3506cead-719c-4ebe-892e-5a448818f377 req-36f9022f-77b5-4570-8fb7-dc5082259eaf 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquired lock "refresh_cache-0df2cec9-7a16-40a4-96bf-c39f8782df91" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 18:29:29 compute-0 nova_compute[183278]: 2026-01-21 18:29:29.487 183284 DEBUG nova.network.neutron [req-3506cead-719c-4ebe-892e-5a448818f377 req-36f9022f-77b5-4570-8fb7-dc5082259eaf 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Refreshing network info cache for port 5acb086b-a54d-4c3a-a7af-bea2feeff1b4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 18:29:29 compute-0 nova_compute[183278]: 2026-01-21 18:29:29.489 183284 DEBUG nova.virt.libvirt.driver [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Start _get_guest_xml network_info=[{"id": "5acb086b-a54d-4c3a-a7af-bea2feeff1b4", "address": "fa:16:3e:ab:ae:e1", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5acb086b-a5", "ovs_interfaceid": "5acb086b-a54d-4c3a-a7af-bea2feeff1b4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T18:09:31Z,direct_url=<?>,disk_format='qcow2',id=672306ae-5521-4fc1-a825-a16d6d125c61,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='05bd8d3790e24414a90ecf55ee1939e1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T18:09:33Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_type': 'disk', 'encryption_options': None, 'boot_index': 0, 'encryption_secret_uuid': None, 'encryption_format': None, 'image_id': '672306ae-5521-4fc1-a825-a16d6d125c61'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 18:29:29 compute-0 nova_compute[183278]: 2026-01-21 18:29:29.493 183284 WARNING nova.virt.libvirt.driver [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 18:29:29 compute-0 nova_compute[183278]: 2026-01-21 18:29:29.497 183284 DEBUG nova.virt.libvirt.host [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 18:29:29 compute-0 nova_compute[183278]: 2026-01-21 18:29:29.497 183284 DEBUG nova.virt.libvirt.host [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 18:29:29 compute-0 nova_compute[183278]: 2026-01-21 18:29:29.499 183284 DEBUG nova.virt.libvirt.host [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 18:29:29 compute-0 nova_compute[183278]: 2026-01-21 18:29:29.500 183284 DEBUG nova.virt.libvirt.host [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 18:29:29 compute-0 nova_compute[183278]: 2026-01-21 18:29:29.501 183284 DEBUG nova.virt.libvirt.driver [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 18:29:29 compute-0 nova_compute[183278]: 2026-01-21 18:29:29.501 183284 DEBUG nova.virt.hardware [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T18:09:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='45095fe9-3fd5-4f1f-87b2-a2a8292135a2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T18:09:31Z,direct_url=<?>,disk_format='qcow2',id=672306ae-5521-4fc1-a825-a16d6d125c61,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='05bd8d3790e24414a90ecf55ee1939e1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T18:09:33Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 18:29:29 compute-0 nova_compute[183278]: 2026-01-21 18:29:29.501 183284 DEBUG nova.virt.hardware [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 18:29:29 compute-0 nova_compute[183278]: 2026-01-21 18:29:29.502 183284 DEBUG nova.virt.hardware [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 18:29:29 compute-0 nova_compute[183278]: 2026-01-21 18:29:29.502 183284 DEBUG nova.virt.hardware [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 18:29:29 compute-0 nova_compute[183278]: 2026-01-21 18:29:29.502 183284 DEBUG nova.virt.hardware [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 18:29:29 compute-0 nova_compute[183278]: 2026-01-21 18:29:29.503 183284 DEBUG nova.virt.hardware [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 18:29:29 compute-0 nova_compute[183278]: 2026-01-21 18:29:29.503 183284 DEBUG nova.virt.hardware [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 18:29:29 compute-0 nova_compute[183278]: 2026-01-21 18:29:29.503 183284 DEBUG nova.virt.hardware [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 18:29:29 compute-0 nova_compute[183278]: 2026-01-21 18:29:29.503 183284 DEBUG nova.virt.hardware [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 18:29:29 compute-0 nova_compute[183278]: 2026-01-21 18:29:29.503 183284 DEBUG nova.virt.hardware [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 18:29:29 compute-0 nova_compute[183278]: 2026-01-21 18:29:29.504 183284 DEBUG nova.virt.hardware [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 18:29:29 compute-0 nova_compute[183278]: 2026-01-21 18:29:29.507 183284 DEBUG nova.virt.libvirt.vif [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T18:29:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1136199599',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1136199599',id=17,image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fe688847145f4dee992c72dd40bbc1ac',ramdisk_id='',reservation_id='r-f59f2q57',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1753607426',owner_user_name='tempest-TestExecuteStrategies-1753607426-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T18:29:26Z,user_data=None,user_id='41dc6e790bc54fbfaf5c6007d3fa5f63',uuid=0df2cec9-7a16-40a4-96bf-c39f8782df91,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5acb086b-a54d-4c3a-a7af-bea2feeff1b4", "address": "fa:16:3e:ab:ae:e1", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5acb086b-a5", "ovs_interfaceid": "5acb086b-a54d-4c3a-a7af-bea2feeff1b4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 18:29:29 compute-0 nova_compute[183278]: 2026-01-21 18:29:29.508 183284 DEBUG nova.network.os_vif_util [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Converting VIF {"id": "5acb086b-a54d-4c3a-a7af-bea2feeff1b4", "address": "fa:16:3e:ab:ae:e1", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5acb086b-a5", "ovs_interfaceid": "5acb086b-a54d-4c3a-a7af-bea2feeff1b4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 18:29:29 compute-0 nova_compute[183278]: 2026-01-21 18:29:29.508 183284 DEBUG nova.network.os_vif_util [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ab:ae:e1,bridge_name='br-int',has_traffic_filtering=True,id=5acb086b-a54d-4c3a-a7af-bea2feeff1b4,network=Network(405ec01b-76d3-4c3c-a31b-5f16d9641fbf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5acb086b-a5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 18:29:29 compute-0 nova_compute[183278]: 2026-01-21 18:29:29.509 183284 DEBUG nova.objects.instance [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lazy-loading 'pci_devices' on Instance uuid 0df2cec9-7a16-40a4-96bf-c39f8782df91 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:29:29 compute-0 nova_compute[183278]: 2026-01-21 18:29:29.522 183284 DEBUG nova.virt.libvirt.driver [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] End _get_guest_xml xml=<domain type="kvm">
Jan 21 18:29:29 compute-0 nova_compute[183278]:   <uuid>0df2cec9-7a16-40a4-96bf-c39f8782df91</uuid>
Jan 21 18:29:29 compute-0 nova_compute[183278]:   <name>instance-00000011</name>
Jan 21 18:29:29 compute-0 nova_compute[183278]:   <memory>131072</memory>
Jan 21 18:29:29 compute-0 nova_compute[183278]:   <vcpu>1</vcpu>
Jan 21 18:29:29 compute-0 nova_compute[183278]:   <metadata>
Jan 21 18:29:29 compute-0 nova_compute[183278]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 18:29:29 compute-0 nova_compute[183278]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 18:29:29 compute-0 nova_compute[183278]:       <nova:name>tempest-TestExecuteStrategies-server-1136199599</nova:name>
Jan 21 18:29:29 compute-0 nova_compute[183278]:       <nova:creationTime>2026-01-21 18:29:29</nova:creationTime>
Jan 21 18:29:29 compute-0 nova_compute[183278]:       <nova:flavor name="m1.nano">
Jan 21 18:29:29 compute-0 nova_compute[183278]:         <nova:memory>128</nova:memory>
Jan 21 18:29:29 compute-0 nova_compute[183278]:         <nova:disk>1</nova:disk>
Jan 21 18:29:29 compute-0 nova_compute[183278]:         <nova:swap>0</nova:swap>
Jan 21 18:29:29 compute-0 nova_compute[183278]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 18:29:29 compute-0 nova_compute[183278]:         <nova:vcpus>1</nova:vcpus>
Jan 21 18:29:29 compute-0 nova_compute[183278]:       </nova:flavor>
Jan 21 18:29:29 compute-0 nova_compute[183278]:       <nova:owner>
Jan 21 18:29:29 compute-0 nova_compute[183278]:         <nova:user uuid="41dc6e790bc54fbfaf5c6007d3fa5f63">tempest-TestExecuteStrategies-1753607426-project-member</nova:user>
Jan 21 18:29:29 compute-0 nova_compute[183278]:         <nova:project uuid="fe688847145f4dee992c72dd40bbc1ac">tempest-TestExecuteStrategies-1753607426</nova:project>
Jan 21 18:29:29 compute-0 nova_compute[183278]:       </nova:owner>
Jan 21 18:29:29 compute-0 nova_compute[183278]:       <nova:root type="image" uuid="672306ae-5521-4fc1-a825-a16d6d125c61"/>
Jan 21 18:29:29 compute-0 nova_compute[183278]:       <nova:ports>
Jan 21 18:29:29 compute-0 nova_compute[183278]:         <nova:port uuid="5acb086b-a54d-4c3a-a7af-bea2feeff1b4">
Jan 21 18:29:29 compute-0 nova_compute[183278]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 21 18:29:29 compute-0 nova_compute[183278]:         </nova:port>
Jan 21 18:29:29 compute-0 nova_compute[183278]:       </nova:ports>
Jan 21 18:29:29 compute-0 nova_compute[183278]:     </nova:instance>
Jan 21 18:29:29 compute-0 nova_compute[183278]:   </metadata>
Jan 21 18:29:29 compute-0 nova_compute[183278]:   <sysinfo type="smbios">
Jan 21 18:29:29 compute-0 nova_compute[183278]:     <system>
Jan 21 18:29:29 compute-0 nova_compute[183278]:       <entry name="manufacturer">RDO</entry>
Jan 21 18:29:29 compute-0 nova_compute[183278]:       <entry name="product">OpenStack Compute</entry>
Jan 21 18:29:29 compute-0 nova_compute[183278]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 18:29:29 compute-0 nova_compute[183278]:       <entry name="serial">0df2cec9-7a16-40a4-96bf-c39f8782df91</entry>
Jan 21 18:29:29 compute-0 nova_compute[183278]:       <entry name="uuid">0df2cec9-7a16-40a4-96bf-c39f8782df91</entry>
Jan 21 18:29:29 compute-0 nova_compute[183278]:       <entry name="family">Virtual Machine</entry>
Jan 21 18:29:29 compute-0 nova_compute[183278]:     </system>
Jan 21 18:29:29 compute-0 nova_compute[183278]:   </sysinfo>
Jan 21 18:29:29 compute-0 nova_compute[183278]:   <os>
Jan 21 18:29:29 compute-0 nova_compute[183278]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 18:29:29 compute-0 nova_compute[183278]:     <boot dev="hd"/>
Jan 21 18:29:29 compute-0 nova_compute[183278]:     <smbios mode="sysinfo"/>
Jan 21 18:29:29 compute-0 nova_compute[183278]:   </os>
Jan 21 18:29:29 compute-0 nova_compute[183278]:   <features>
Jan 21 18:29:29 compute-0 nova_compute[183278]:     <acpi/>
Jan 21 18:29:29 compute-0 nova_compute[183278]:     <apic/>
Jan 21 18:29:29 compute-0 nova_compute[183278]:     <vmcoreinfo/>
Jan 21 18:29:29 compute-0 nova_compute[183278]:   </features>
Jan 21 18:29:29 compute-0 nova_compute[183278]:   <clock offset="utc">
Jan 21 18:29:29 compute-0 nova_compute[183278]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 18:29:29 compute-0 nova_compute[183278]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 18:29:29 compute-0 nova_compute[183278]:     <timer name="hpet" present="no"/>
Jan 21 18:29:29 compute-0 nova_compute[183278]:   </clock>
Jan 21 18:29:29 compute-0 nova_compute[183278]:   <cpu mode="custom" match="exact">
Jan 21 18:29:29 compute-0 nova_compute[183278]:     <model>Nehalem</model>
Jan 21 18:29:29 compute-0 nova_compute[183278]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 18:29:29 compute-0 nova_compute[183278]:   </cpu>
Jan 21 18:29:29 compute-0 nova_compute[183278]:   <devices>
Jan 21 18:29:29 compute-0 nova_compute[183278]:     <disk type="file" device="disk">
Jan 21 18:29:29 compute-0 nova_compute[183278]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 18:29:29 compute-0 nova_compute[183278]:       <source file="/var/lib/nova/instances/0df2cec9-7a16-40a4-96bf-c39f8782df91/disk"/>
Jan 21 18:29:29 compute-0 nova_compute[183278]:       <target dev="vda" bus="virtio"/>
Jan 21 18:29:29 compute-0 nova_compute[183278]:     </disk>
Jan 21 18:29:29 compute-0 nova_compute[183278]:     <disk type="file" device="cdrom">
Jan 21 18:29:29 compute-0 nova_compute[183278]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 18:29:29 compute-0 nova_compute[183278]:       <source file="/var/lib/nova/instances/0df2cec9-7a16-40a4-96bf-c39f8782df91/disk.config"/>
Jan 21 18:29:29 compute-0 nova_compute[183278]:       <target dev="sda" bus="sata"/>
Jan 21 18:29:29 compute-0 nova_compute[183278]:     </disk>
Jan 21 18:29:29 compute-0 nova_compute[183278]:     <interface type="ethernet">
Jan 21 18:29:29 compute-0 nova_compute[183278]:       <mac address="fa:16:3e:ab:ae:e1"/>
Jan 21 18:29:29 compute-0 nova_compute[183278]:       <model type="virtio"/>
Jan 21 18:29:29 compute-0 nova_compute[183278]:       <driver name="vhost" rx_queue_size="512"/>
Jan 21 18:29:29 compute-0 nova_compute[183278]:       <mtu size="1442"/>
Jan 21 18:29:29 compute-0 nova_compute[183278]:       <target dev="tap5acb086b-a5"/>
Jan 21 18:29:29 compute-0 nova_compute[183278]:     </interface>
Jan 21 18:29:29 compute-0 nova_compute[183278]:     <serial type="pty">
Jan 21 18:29:29 compute-0 nova_compute[183278]:       <log file="/var/lib/nova/instances/0df2cec9-7a16-40a4-96bf-c39f8782df91/console.log" append="off"/>
Jan 21 18:29:29 compute-0 nova_compute[183278]:     </serial>
Jan 21 18:29:29 compute-0 nova_compute[183278]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 18:29:29 compute-0 nova_compute[183278]:     <video>
Jan 21 18:29:29 compute-0 nova_compute[183278]:       <model type="virtio"/>
Jan 21 18:29:29 compute-0 nova_compute[183278]:     </video>
Jan 21 18:29:29 compute-0 nova_compute[183278]:     <input type="tablet" bus="usb"/>
Jan 21 18:29:29 compute-0 nova_compute[183278]:     <rng model="virtio">
Jan 21 18:29:29 compute-0 nova_compute[183278]:       <backend model="random">/dev/urandom</backend>
Jan 21 18:29:29 compute-0 nova_compute[183278]:     </rng>
Jan 21 18:29:29 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root"/>
Jan 21 18:29:29 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:29:29 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:29:29 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:29:29 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:29:29 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:29:29 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:29:29 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:29:29 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:29:29 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:29:29 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:29:29 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:29:29 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:29:29 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:29:29 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:29:29 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:29:29 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:29:29 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:29:29 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:29:29 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:29:29 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:29:29 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:29:29 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:29:29 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:29:29 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:29:29 compute-0 nova_compute[183278]:     <controller type="usb" index="0"/>
Jan 21 18:29:29 compute-0 nova_compute[183278]:     <memballoon model="virtio">
Jan 21 18:29:29 compute-0 nova_compute[183278]:       <stats period="10"/>
Jan 21 18:29:29 compute-0 nova_compute[183278]:     </memballoon>
Jan 21 18:29:29 compute-0 nova_compute[183278]:   </devices>
Jan 21 18:29:29 compute-0 nova_compute[183278]: </domain>
Jan 21 18:29:29 compute-0 nova_compute[183278]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 18:29:29 compute-0 nova_compute[183278]: 2026-01-21 18:29:29.523 183284 DEBUG nova.compute.manager [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Preparing to wait for external event network-vif-plugged-5acb086b-a54d-4c3a-a7af-bea2feeff1b4 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 21 18:29:29 compute-0 nova_compute[183278]: 2026-01-21 18:29:29.523 183284 DEBUG oslo_concurrency.lockutils [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Acquiring lock "0df2cec9-7a16-40a4-96bf-c39f8782df91-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:29:29 compute-0 nova_compute[183278]: 2026-01-21 18:29:29.523 183284 DEBUG oslo_concurrency.lockutils [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "0df2cec9-7a16-40a4-96bf-c39f8782df91-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:29:29 compute-0 nova_compute[183278]: 2026-01-21 18:29:29.523 183284 DEBUG oslo_concurrency.lockutils [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "0df2cec9-7a16-40a4-96bf-c39f8782df91-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:29:29 compute-0 nova_compute[183278]: 2026-01-21 18:29:29.524 183284 DEBUG nova.virt.libvirt.vif [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T18:29:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1136199599',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1136199599',id=17,image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fe688847145f4dee992c72dd40bbc1ac',ramdisk_id='',reservation_id='r-f59f2q57',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1753607426',owner_user_name='tempest-TestExecuteStrategies-1753607426-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T18:29:26Z,user_data=None,user_id='41dc6e790bc54fbfaf5c6007d3fa5f63',uuid=0df2cec9-7a16-40a4-96bf-c39f8782df91,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5acb086b-a54d-4c3a-a7af-bea2feeff1b4", "address": "fa:16:3e:ab:ae:e1", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5acb086b-a5", "ovs_interfaceid": "5acb086b-a54d-4c3a-a7af-bea2feeff1b4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 21 18:29:29 compute-0 nova_compute[183278]: 2026-01-21 18:29:29.524 183284 DEBUG nova.network.os_vif_util [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Converting VIF {"id": "5acb086b-a54d-4c3a-a7af-bea2feeff1b4", "address": "fa:16:3e:ab:ae:e1", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5acb086b-a5", "ovs_interfaceid": "5acb086b-a54d-4c3a-a7af-bea2feeff1b4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 18:29:29 compute-0 nova_compute[183278]: 2026-01-21 18:29:29.525 183284 DEBUG nova.network.os_vif_util [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ab:ae:e1,bridge_name='br-int',has_traffic_filtering=True,id=5acb086b-a54d-4c3a-a7af-bea2feeff1b4,network=Network(405ec01b-76d3-4c3c-a31b-5f16d9641fbf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5acb086b-a5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 18:29:29 compute-0 nova_compute[183278]: 2026-01-21 18:29:29.525 183284 DEBUG os_vif [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ab:ae:e1,bridge_name='br-int',has_traffic_filtering=True,id=5acb086b-a54d-4c3a-a7af-bea2feeff1b4,network=Network(405ec01b-76d3-4c3c-a31b-5f16d9641fbf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5acb086b-a5') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 21 18:29:29 compute-0 nova_compute[183278]: 2026-01-21 18:29:29.525 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:29:29 compute-0 nova_compute[183278]: 2026-01-21 18:29:29.526 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:29:29 compute-0 nova_compute[183278]: 2026-01-21 18:29:29.526 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 18:29:29 compute-0 nova_compute[183278]: 2026-01-21 18:29:29.528 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:29:29 compute-0 nova_compute[183278]: 2026-01-21 18:29:29.528 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5acb086b-a5, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:29:29 compute-0 nova_compute[183278]: 2026-01-21 18:29:29.529 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5acb086b-a5, col_values=(('external_ids', {'iface-id': '5acb086b-a54d-4c3a-a7af-bea2feeff1b4', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ab:ae:e1', 'vm-uuid': '0df2cec9-7a16-40a4-96bf-c39f8782df91'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:29:29 compute-0 nova_compute[183278]: 2026-01-21 18:29:29.530 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:29:29 compute-0 NetworkManager[55506]: <info>  [1769020169.5310] manager: (tap5acb086b-a5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/58)
Jan 21 18:29:29 compute-0 nova_compute[183278]: 2026-01-21 18:29:29.532 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 18:29:29 compute-0 nova_compute[183278]: 2026-01-21 18:29:29.536 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:29:29 compute-0 nova_compute[183278]: 2026-01-21 18:29:29.536 183284 INFO os_vif [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ab:ae:e1,bridge_name='br-int',has_traffic_filtering=True,id=5acb086b-a54d-4c3a-a7af-bea2feeff1b4,network=Network(405ec01b-76d3-4c3c-a31b-5f16d9641fbf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5acb086b-a5')
Jan 21 18:29:29 compute-0 nova_compute[183278]: 2026-01-21 18:29:29.695 183284 DEBUG nova.virt.libvirt.driver [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 18:29:29 compute-0 nova_compute[183278]: 2026-01-21 18:29:29.695 183284 DEBUG nova.virt.libvirt.driver [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 18:29:29 compute-0 nova_compute[183278]: 2026-01-21 18:29:29.695 183284 DEBUG nova.virt.libvirt.driver [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] No VIF found with MAC fa:16:3e:ab:ae:e1, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 21 18:29:29 compute-0 nova_compute[183278]: 2026-01-21 18:29:29.696 183284 INFO nova.virt.libvirt.driver [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Using config drive
Jan 21 18:29:29 compute-0 podman[192560]: time="2026-01-21T18:29:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 18:29:29 compute-0 podman[192560]: @ - - [21/Jan/2026:18:29:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15354 "" "Go-http-client/1.1"
Jan 21 18:29:29 compute-0 podman[192560]: @ - - [21/Jan/2026:18:29:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2176 "" "Go-http-client/1.1"
Jan 21 18:29:29 compute-0 nova_compute[183278]: 2026-01-21 18:29:29.958 183284 INFO nova.virt.libvirt.driver [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Creating config drive at /var/lib/nova/instances/0df2cec9-7a16-40a4-96bf-c39f8782df91/disk.config
Jan 21 18:29:29 compute-0 nova_compute[183278]: 2026-01-21 18:29:29.963 183284 DEBUG oslo_concurrency.processutils [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/0df2cec9-7a16-40a4-96bf-c39f8782df91/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjwc86q1s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:29:30 compute-0 nova_compute[183278]: 2026-01-21 18:29:30.089 183284 DEBUG oslo_concurrency.processutils [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/0df2cec9-7a16-40a4-96bf-c39f8782df91/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjwc86q1s" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:29:30 compute-0 kernel: tap5acb086b-a5: entered promiscuous mode
Jan 21 18:29:30 compute-0 NetworkManager[55506]: <info>  [1769020170.1456] manager: (tap5acb086b-a5): new Tun device (/org/freedesktop/NetworkManager/Devices/59)
Jan 21 18:29:30 compute-0 nova_compute[183278]: 2026-01-21 18:29:30.151 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:29:30 compute-0 ovn_controller[95419]: 2026-01-21T18:29:30Z|00127|binding|INFO|Claiming lport 5acb086b-a54d-4c3a-a7af-bea2feeff1b4 for this chassis.
Jan 21 18:29:30 compute-0 ovn_controller[95419]: 2026-01-21T18:29:30Z|00128|binding|INFO|5acb086b-a54d-4c3a-a7af-bea2feeff1b4: Claiming fa:16:3e:ab:ae:e1 10.100.0.6
Jan 21 18:29:30 compute-0 systemd-udevd[209274]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 18:29:30 compute-0 systemd-machined[154592]: New machine qemu-12-instance-00000011.
Jan 21 18:29:30 compute-0 NetworkManager[55506]: <info>  [1769020170.1829] device (tap5acb086b-a5): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 18:29:30 compute-0 NetworkManager[55506]: <info>  [1769020170.1835] device (tap5acb086b-a5): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 18:29:30 compute-0 systemd[1]: Started Virtual Machine qemu-12-instance-00000011.
Jan 21 18:29:30 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:29:30.203 104698 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ab:ae:e1 10.100.0.6'], port_security=['fa:16:3e:ab:ae:e1 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '0df2cec9-7a16-40a4-96bf-c39f8782df91', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-405ec01b-76d3-4c3c-a31b-5f16d9641fbf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe688847145f4dee992c72dd40bbc1ac', 'neutron:revision_number': '2', 'neutron:security_group_ids': '772bc98e-4f63-476c-ac15-1e185ee339f7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=64b67886-c0d9-40d2-a2d0-cf96a9cd3c14, chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>], logical_port=5acb086b-a54d-4c3a-a7af-bea2feeff1b4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 18:29:30 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:29:30.205 104698 INFO neutron.agent.ovn.metadata.agent [-] Port 5acb086b-a54d-4c3a-a7af-bea2feeff1b4 in datapath 405ec01b-76d3-4c3c-a31b-5f16d9641fbf bound to our chassis
Jan 21 18:29:30 compute-0 nova_compute[183278]: 2026-01-21 18:29:30.205 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:29:30 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:29:30.206 104698 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 405ec01b-76d3-4c3c-a31b-5f16d9641fbf
Jan 21 18:29:30 compute-0 ovn_controller[95419]: 2026-01-21T18:29:30Z|00129|binding|INFO|Setting lport 5acb086b-a54d-4c3a-a7af-bea2feeff1b4 ovn-installed in OVS
Jan 21 18:29:30 compute-0 ovn_controller[95419]: 2026-01-21T18:29:30Z|00130|binding|INFO|Setting lport 5acb086b-a54d-4c3a-a7af-bea2feeff1b4 up in Southbound
Jan 21 18:29:30 compute-0 nova_compute[183278]: 2026-01-21 18:29:30.210 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:29:30 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:29:30.216 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[68d65c29-e377-4221-a8da-e28e5400c99c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:29:30 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:29:30.217 104698 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap405ec01b-71 in ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 21 18:29:30 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:29:30.220 203892 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap405ec01b-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 21 18:29:30 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:29:30.220 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[3cca0789-93a4-450b-ad4b-688690747fca]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:29:30 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:29:30.220 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[2b2c4143-0b95-4004-8445-f291c10d2c2d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:29:30 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:29:30.232 105036 DEBUG oslo.privsep.daemon [-] privsep: reply[3c9e1cf9-b3cd-4dee-aff1-df0b6c61304b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:29:30 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:29:30.255 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[a0b8c7a2-dd52-48e6-adc1-21e5b6e96cd6]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:29:30 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:29:30.281 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[92e58fcf-966b-4b7f-8958-70aa25e39242]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:29:30 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:29:30.287 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[7e83a9b5-bf03-41a7-abf5-6f07cb0ba9ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:29:30 compute-0 NetworkManager[55506]: <info>  [1769020170.2881] manager: (tap405ec01b-70): new Veth device (/org/freedesktop/NetworkManager/Devices/60)
Jan 21 18:29:30 compute-0 systemd-udevd[209277]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 18:29:30 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:29:30.322 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[a243c5f3-ae12-4b1e-a84d-02c6f917a67b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:29:30 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:29:30.324 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[b4ecc963-4829-420d-bf3b-6def00e79ac5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:29:30 compute-0 NetworkManager[55506]: <info>  [1769020170.3441] device (tap405ec01b-70): carrier: link connected
Jan 21 18:29:30 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:29:30.350 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[54d9dbf7-7cb9-4d93-9a24-4ee615da7e35]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:29:30 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:29:30.367 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[2766e08e-461e-486c-b8a8-48f55dd1e6f7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap405ec01b-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6c:95:02'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 41], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 473395, 'reachable_time': 16088, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 209308, 'error': None, 'target': 'ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:29:30 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:29:30.380 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[c4265dc7-b65c-4b0c-89d2-1c1eb1db3b7c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe6c:9502'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 473395, 'tstamp': 473395}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209309, 'error': None, 'target': 'ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:29:30 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:29:30.398 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[a1607064-c144-4dce-a7cd-b420c4440b38]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap405ec01b-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6c:95:02'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 41], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 473395, 'reachable_time': 16088, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 209310, 'error': None, 'target': 'ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:29:30 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:29:30.429 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[cb8f93d6-28f4-4309-9026-0a51e2afe92f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:29:30 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:29:30.489 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[6a127fb2-3f67-4756-896a-d4c17e7049b5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:29:30 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:29:30.490 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap405ec01b-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:29:30 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:29:30.491 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 18:29:30 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:29:30.491 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap405ec01b-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:29:30 compute-0 kernel: tap405ec01b-70: entered promiscuous mode
Jan 21 18:29:30 compute-0 NetworkManager[55506]: <info>  [1769020170.4945] manager: (tap405ec01b-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/61)
Jan 21 18:29:30 compute-0 nova_compute[183278]: 2026-01-21 18:29:30.494 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:29:30 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:29:30.496 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap405ec01b-70, col_values=(('external_ids', {'iface-id': '9c897ad2-8ce5-4903-8c83-1ed8f117dcdd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:29:30 compute-0 nova_compute[183278]: 2026-01-21 18:29:30.497 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:29:30 compute-0 ovn_controller[95419]: 2026-01-21T18:29:30Z|00131|binding|INFO|Releasing lport 9c897ad2-8ce5-4903-8c83-1ed8f117dcdd from this chassis (sb_readonly=0)
Jan 21 18:29:30 compute-0 nova_compute[183278]: 2026-01-21 18:29:30.509 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:29:30 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:29:30.510 104698 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/405ec01b-76d3-4c3c-a31b-5f16d9641fbf.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/405ec01b-76d3-4c3c-a31b-5f16d9641fbf.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 21 18:29:30 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:29:30.510 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[a041703c-bdf6-433d-bf81-4c937b111d76]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:29:30 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:29:30.511 104698 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 18:29:30 compute-0 ovn_metadata_agent[104693]: global
Jan 21 18:29:30 compute-0 ovn_metadata_agent[104693]:     log         /dev/log local0 debug
Jan 21 18:29:30 compute-0 ovn_metadata_agent[104693]:     log-tag     haproxy-metadata-proxy-405ec01b-76d3-4c3c-a31b-5f16d9641fbf
Jan 21 18:29:30 compute-0 ovn_metadata_agent[104693]:     user        root
Jan 21 18:29:30 compute-0 ovn_metadata_agent[104693]:     group       root
Jan 21 18:29:30 compute-0 ovn_metadata_agent[104693]:     maxconn     1024
Jan 21 18:29:30 compute-0 ovn_metadata_agent[104693]:     pidfile     /var/lib/neutron/external/pids/405ec01b-76d3-4c3c-a31b-5f16d9641fbf.pid.haproxy
Jan 21 18:29:30 compute-0 ovn_metadata_agent[104693]:     daemon
Jan 21 18:29:30 compute-0 ovn_metadata_agent[104693]: 
Jan 21 18:29:30 compute-0 ovn_metadata_agent[104693]: defaults
Jan 21 18:29:30 compute-0 ovn_metadata_agent[104693]:     log global
Jan 21 18:29:30 compute-0 ovn_metadata_agent[104693]:     mode http
Jan 21 18:29:30 compute-0 ovn_metadata_agent[104693]:     option httplog
Jan 21 18:29:30 compute-0 ovn_metadata_agent[104693]:     option dontlognull
Jan 21 18:29:30 compute-0 ovn_metadata_agent[104693]:     option http-server-close
Jan 21 18:29:30 compute-0 ovn_metadata_agent[104693]:     option forwardfor
Jan 21 18:29:30 compute-0 ovn_metadata_agent[104693]:     retries                 3
Jan 21 18:29:30 compute-0 ovn_metadata_agent[104693]:     timeout http-request    30s
Jan 21 18:29:30 compute-0 ovn_metadata_agent[104693]:     timeout connect         30s
Jan 21 18:29:30 compute-0 ovn_metadata_agent[104693]:     timeout client          32s
Jan 21 18:29:30 compute-0 ovn_metadata_agent[104693]:     timeout server          32s
Jan 21 18:29:30 compute-0 ovn_metadata_agent[104693]:     timeout http-keep-alive 30s
Jan 21 18:29:30 compute-0 ovn_metadata_agent[104693]: 
Jan 21 18:29:30 compute-0 ovn_metadata_agent[104693]: 
Jan 21 18:29:30 compute-0 ovn_metadata_agent[104693]: listen listener
Jan 21 18:29:30 compute-0 ovn_metadata_agent[104693]:     bind 169.254.169.254:80
Jan 21 18:29:30 compute-0 ovn_metadata_agent[104693]:     server metadata /var/lib/neutron/metadata_proxy
Jan 21 18:29:30 compute-0 ovn_metadata_agent[104693]:     http-request add-header X-OVN-Network-ID 405ec01b-76d3-4c3c-a31b-5f16d9641fbf
Jan 21 18:29:30 compute-0 ovn_metadata_agent[104693]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 21 18:29:30 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:29:30.512 104698 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf', 'env', 'PROCESS_TAG=haproxy-405ec01b-76d3-4c3c-a31b-5f16d9641fbf', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/405ec01b-76d3-4c3c-a31b-5f16d9641fbf.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 21 18:29:30 compute-0 nova_compute[183278]: 2026-01-21 18:29:30.532 183284 DEBUG nova.virt.driver [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Emitting event <LifecycleEvent: 1769020170.5316412, 0df2cec9-7a16-40a4-96bf-c39f8782df91 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:29:30 compute-0 nova_compute[183278]: 2026-01-21 18:29:30.532 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] VM Started (Lifecycle Event)
Jan 21 18:29:30 compute-0 nova_compute[183278]: 2026-01-21 18:29:30.553 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:29:30 compute-0 nova_compute[183278]: 2026-01-21 18:29:30.556 183284 DEBUG nova.virt.driver [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Emitting event <LifecycleEvent: 1769020170.531797, 0df2cec9-7a16-40a4-96bf-c39f8782df91 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:29:30 compute-0 nova_compute[183278]: 2026-01-21 18:29:30.556 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] VM Paused (Lifecycle Event)
Jan 21 18:29:30 compute-0 nova_compute[183278]: 2026-01-21 18:29:30.580 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:29:30 compute-0 nova_compute[183278]: 2026-01-21 18:29:30.582 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 18:29:30 compute-0 nova_compute[183278]: 2026-01-21 18:29:30.604 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 18:29:30 compute-0 nova_compute[183278]: 2026-01-21 18:29:30.769 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:29:30 compute-0 podman[209349]: 2026-01-21 18:29:30.828649029 +0000 UTC m=+0.023892749 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 18:29:31 compute-0 nova_compute[183278]: 2026-01-21 18:29:31.054 183284 DEBUG nova.compute.manager [req-5a4b2130-93a5-4f87-98b2-ee35edc2c062 req-df818dec-01a2-4769-a504-ac383c0ac767 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Received event network-vif-plugged-5acb086b-a54d-4c3a-a7af-bea2feeff1b4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:29:31 compute-0 nova_compute[183278]: 2026-01-21 18:29:31.055 183284 DEBUG oslo_concurrency.lockutils [req-5a4b2130-93a5-4f87-98b2-ee35edc2c062 req-df818dec-01a2-4769-a504-ac383c0ac767 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "0df2cec9-7a16-40a4-96bf-c39f8782df91-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:29:31 compute-0 nova_compute[183278]: 2026-01-21 18:29:31.055 183284 DEBUG oslo_concurrency.lockutils [req-5a4b2130-93a5-4f87-98b2-ee35edc2c062 req-df818dec-01a2-4769-a504-ac383c0ac767 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "0df2cec9-7a16-40a4-96bf-c39f8782df91-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:29:31 compute-0 nova_compute[183278]: 2026-01-21 18:29:31.055 183284 DEBUG oslo_concurrency.lockutils [req-5a4b2130-93a5-4f87-98b2-ee35edc2c062 req-df818dec-01a2-4769-a504-ac383c0ac767 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "0df2cec9-7a16-40a4-96bf-c39f8782df91-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:29:31 compute-0 nova_compute[183278]: 2026-01-21 18:29:31.056 183284 DEBUG nova.compute.manager [req-5a4b2130-93a5-4f87-98b2-ee35edc2c062 req-df818dec-01a2-4769-a504-ac383c0ac767 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Processing event network-vif-plugged-5acb086b-a54d-4c3a-a7af-bea2feeff1b4 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 21 18:29:31 compute-0 nova_compute[183278]: 2026-01-21 18:29:31.056 183284 DEBUG nova.compute.manager [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 18:29:31 compute-0 nova_compute[183278]: 2026-01-21 18:29:31.059 183284 DEBUG nova.virt.driver [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Emitting event <LifecycleEvent: 1769020171.0592167, 0df2cec9-7a16-40a4-96bf-c39f8782df91 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:29:31 compute-0 nova_compute[183278]: 2026-01-21 18:29:31.059 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] VM Resumed (Lifecycle Event)
Jan 21 18:29:31 compute-0 nova_compute[183278]: 2026-01-21 18:29:31.061 183284 DEBUG nova.virt.libvirt.driver [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 18:29:31 compute-0 nova_compute[183278]: 2026-01-21 18:29:31.064 183284 INFO nova.virt.libvirt.driver [-] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Instance spawned successfully.
Jan 21 18:29:31 compute-0 nova_compute[183278]: 2026-01-21 18:29:31.064 183284 DEBUG nova.virt.libvirt.driver [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 21 18:29:31 compute-0 nova_compute[183278]: 2026-01-21 18:29:31.082 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:29:31 compute-0 nova_compute[183278]: 2026-01-21 18:29:31.087 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 18:29:31 compute-0 nova_compute[183278]: 2026-01-21 18:29:31.090 183284 DEBUG nova.virt.libvirt.driver [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:29:31 compute-0 nova_compute[183278]: 2026-01-21 18:29:31.090 183284 DEBUG nova.virt.libvirt.driver [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:29:31 compute-0 nova_compute[183278]: 2026-01-21 18:29:31.091 183284 DEBUG nova.virt.libvirt.driver [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:29:31 compute-0 nova_compute[183278]: 2026-01-21 18:29:31.091 183284 DEBUG nova.virt.libvirt.driver [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:29:31 compute-0 nova_compute[183278]: 2026-01-21 18:29:31.092 183284 DEBUG nova.virt.libvirt.driver [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:29:31 compute-0 nova_compute[183278]: 2026-01-21 18:29:31.092 183284 DEBUG nova.virt.libvirt.driver [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:29:31 compute-0 nova_compute[183278]: 2026-01-21 18:29:31.125 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 18:29:31 compute-0 nova_compute[183278]: 2026-01-21 18:29:31.155 183284 INFO nova.compute.manager [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Took 5.05 seconds to spawn the instance on the hypervisor.
Jan 21 18:29:31 compute-0 nova_compute[183278]: 2026-01-21 18:29:31.156 183284 DEBUG nova.compute.manager [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:29:31 compute-0 nova_compute[183278]: 2026-01-21 18:29:31.177 183284 DEBUG nova.network.neutron [req-3506cead-719c-4ebe-892e-5a448818f377 req-36f9022f-77b5-4570-8fb7-dc5082259eaf 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Updated VIF entry in instance network info cache for port 5acb086b-a54d-4c3a-a7af-bea2feeff1b4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 18:29:31 compute-0 nova_compute[183278]: 2026-01-21 18:29:31.180 183284 DEBUG nova.network.neutron [req-3506cead-719c-4ebe-892e-5a448818f377 req-36f9022f-77b5-4570-8fb7-dc5082259eaf 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Updating instance_info_cache with network_info: [{"id": "5acb086b-a54d-4c3a-a7af-bea2feeff1b4", "address": "fa:16:3e:ab:ae:e1", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5acb086b-a5", "ovs_interfaceid": "5acb086b-a54d-4c3a-a7af-bea2feeff1b4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:29:31 compute-0 nova_compute[183278]: 2026-01-21 18:29:31.203 183284 DEBUG oslo_concurrency.lockutils [req-3506cead-719c-4ebe-892e-5a448818f377 req-36f9022f-77b5-4570-8fb7-dc5082259eaf 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Releasing lock "refresh_cache-0df2cec9-7a16-40a4-96bf-c39f8782df91" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 18:29:31 compute-0 nova_compute[183278]: 2026-01-21 18:29:31.220 183284 INFO nova.compute.manager [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Took 5.46 seconds to build instance.
Jan 21 18:29:31 compute-0 nova_compute[183278]: 2026-01-21 18:29:31.238 183284 DEBUG oslo_concurrency.lockutils [None req-92154ee5-2dd9-4754-81eb-d59c2d5373a4 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "0df2cec9-7a16-40a4-96bf-c39f8782df91" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.849s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:29:31 compute-0 openstack_network_exporter[195402]: ERROR   18:29:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 21 18:29:31 compute-0 openstack_network_exporter[195402]: ERROR   18:29:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 21 18:29:32 compute-0 podman[209362]: 2026-01-21 18:29:32.793626322 +0000 UTC m=+0.847875858 container health_status 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, distribution-scope=public, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_id=openstack_network_exporter, managed_by=edpm_ansible, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, vcs-type=git)
Jan 21 18:29:33 compute-0 podman[209349]: 2026-01-21 18:29:33.027017935 +0000 UTC m=+2.222261645 container create 430a5d02bf3a6734d96c8d3c91d695b8f377bfa48e24185af56d61aee24d192f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 21 18:29:33 compute-0 systemd[1]: Started libpod-conmon-430a5d02bf3a6734d96c8d3c91d695b8f377bfa48e24185af56d61aee24d192f.scope.
Jan 21 18:29:33 compute-0 systemd[1]: Started libcrun container.
Jan 21 18:29:33 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a0289f9466c8dd285f3bbaff940e3095917b75b6332b77401cb3f6ca64d66201/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 18:29:33 compute-0 podman[209349]: 2026-01-21 18:29:33.11408618 +0000 UTC m=+2.309329920 container init 430a5d02bf3a6734d96c8d3c91d695b8f377bfa48e24185af56d61aee24d192f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 21 18:29:33 compute-0 podman[209349]: 2026-01-21 18:29:33.120034564 +0000 UTC m=+2.315278264 container start 430a5d02bf3a6734d96c8d3c91d695b8f377bfa48e24185af56d61aee24d192f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 18:29:33 compute-0 neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf[209388]: [NOTICE]   (209392) : New worker (209394) forked
Jan 21 18:29:33 compute-0 neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf[209388]: [NOTICE]   (209392) : Loading success.
Jan 21 18:29:33 compute-0 nova_compute[183278]: 2026-01-21 18:29:33.147 183284 DEBUG nova.compute.manager [req-a6842b0b-1382-4779-8834-61a5a4568637 req-8f96a2f2-0d14-4da0-a9c4-6683c84007d0 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Received event network-vif-plugged-5acb086b-a54d-4c3a-a7af-bea2feeff1b4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:29:33 compute-0 nova_compute[183278]: 2026-01-21 18:29:33.148 183284 DEBUG oslo_concurrency.lockutils [req-a6842b0b-1382-4779-8834-61a5a4568637 req-8f96a2f2-0d14-4da0-a9c4-6683c84007d0 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "0df2cec9-7a16-40a4-96bf-c39f8782df91-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:29:33 compute-0 nova_compute[183278]: 2026-01-21 18:29:33.148 183284 DEBUG oslo_concurrency.lockutils [req-a6842b0b-1382-4779-8834-61a5a4568637 req-8f96a2f2-0d14-4da0-a9c4-6683c84007d0 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "0df2cec9-7a16-40a4-96bf-c39f8782df91-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:29:33 compute-0 nova_compute[183278]: 2026-01-21 18:29:33.149 183284 DEBUG oslo_concurrency.lockutils [req-a6842b0b-1382-4779-8834-61a5a4568637 req-8f96a2f2-0d14-4da0-a9c4-6683c84007d0 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "0df2cec9-7a16-40a4-96bf-c39f8782df91-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:29:33 compute-0 nova_compute[183278]: 2026-01-21 18:29:33.149 183284 DEBUG nova.compute.manager [req-a6842b0b-1382-4779-8834-61a5a4568637 req-8f96a2f2-0d14-4da0-a9c4-6683c84007d0 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] No waiting events found dispatching network-vif-plugged-5acb086b-a54d-4c3a-a7af-bea2feeff1b4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:29:33 compute-0 nova_compute[183278]: 2026-01-21 18:29:33.149 183284 WARNING nova.compute.manager [req-a6842b0b-1382-4779-8834-61a5a4568637 req-8f96a2f2-0d14-4da0-a9c4-6683c84007d0 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Received unexpected event network-vif-plugged-5acb086b-a54d-4c3a-a7af-bea2feeff1b4 for instance with vm_state active and task_state None.
Jan 21 18:29:34 compute-0 nova_compute[183278]: 2026-01-21 18:29:34.532 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:29:35 compute-0 nova_compute[183278]: 2026-01-21 18:29:35.801 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:29:38 compute-0 podman[209405]: 2026-01-21 18:29:38.00571125 +0000 UTC m=+0.055401651 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 21 18:29:38 compute-0 podman[209404]: 2026-01-21 18:29:38.038028211 +0000 UTC m=+0.089947825 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 21 18:29:38 compute-0 nova_compute[183278]: 2026-01-21 18:29:38.209 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:29:38 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:29:38.208 104698 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e2:aa:66', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f6:39:87:73:c8'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 18:29:38 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:29:38.209 104698 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 21 18:29:38 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:29:38.210 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a4db021d-a451-4e5f-8011-49af760bda68, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:29:39 compute-0 nova_compute[183278]: 2026-01-21 18:29:39.534 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:29:40 compute-0 nova_compute[183278]: 2026-01-21 18:29:40.802 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:29:42 compute-0 podman[209446]: 2026-01-21 18:29:42.990639144 +0000 UTC m=+0.043535593 container health_status 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 21 18:29:44 compute-0 nova_compute[183278]: 2026-01-21 18:29:44.538 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:29:45 compute-0 nova_compute[183278]: 2026-01-21 18:29:45.804 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:29:45 compute-0 ovn_controller[95419]: 2026-01-21T18:29:45Z|00022|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ab:ae:e1 10.100.0.6
Jan 21 18:29:45 compute-0 ovn_controller[95419]: 2026-01-21T18:29:45Z|00023|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ab:ae:e1 10.100.0.6
Jan 21 18:29:49 compute-0 nova_compute[183278]: 2026-01-21 18:29:49.577 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:29:50 compute-0 nova_compute[183278]: 2026-01-21 18:29:50.805 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:29:54 compute-0 nova_compute[183278]: 2026-01-21 18:29:54.580 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:29:55 compute-0 nova_compute[183278]: 2026-01-21 18:29:55.807 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:29:59 compute-0 nova_compute[183278]: 2026-01-21 18:29:59.582 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:29:59 compute-0 podman[192560]: time="2026-01-21T18:29:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 18:29:59 compute-0 podman[192560]: @ - - [21/Jan/2026:18:29:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16587 "" "Go-http-client/1.1"
Jan 21 18:29:59 compute-0 podman[192560]: @ - - [21/Jan/2026:18:29:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2631 "" "Go-http-client/1.1"
Jan 21 18:30:00 compute-0 nova_compute[183278]: 2026-01-21 18:30:00.809 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:30:01 compute-0 openstack_network_exporter[195402]: ERROR   18:30:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 21 18:30:01 compute-0 openstack_network_exporter[195402]: ERROR   18:30:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 21 18:30:04 compute-0 podman[209488]: 2026-01-21 18:30:04.027731037 +0000 UTC m=+0.082995327 container health_status 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, managed_by=edpm_ansible, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, release=1755695350, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 21 18:30:04 compute-0 nova_compute[183278]: 2026-01-21 18:30:04.585 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:30:05 compute-0 nova_compute[183278]: 2026-01-21 18:30:05.810 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:30:08 compute-0 ovn_controller[95419]: 2026-01-21T18:30:08Z|00132|memory_trim|INFO|Detected inactivity (last active 30007 ms ago): trimming memory
Jan 21 18:30:08 compute-0 podman[209511]: 2026-01-21 18:30:08.991284877 +0000 UTC m=+0.045461110 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 18:30:09 compute-0 podman[209510]: 2026-01-21 18:30:09.016612629 +0000 UTC m=+0.075161417 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller)
Jan 21 18:30:09 compute-0 nova_compute[183278]: 2026-01-21 18:30:09.586 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:30:10 compute-0 nova_compute[183278]: 2026-01-21 18:30:10.811 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:30:11 compute-0 nova_compute[183278]: 2026-01-21 18:30:11.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:30:11 compute-0 nova_compute[183278]: 2026-01-21 18:30:11.817 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 21 18:30:13 compute-0 podman[209556]: 2026-01-21 18:30:13.988443947 +0000 UTC m=+0.044729562 container health_status 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 21 18:30:14 compute-0 nova_compute[183278]: 2026-01-21 18:30:14.589 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:30:15 compute-0 nova_compute[183278]: 2026-01-21 18:30:15.813 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:30:19 compute-0 nova_compute[183278]: 2026-01-21 18:30:19.593 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:30:19 compute-0 nova_compute[183278]: 2026-01-21 18:30:19.854 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:30:19 compute-0 nova_compute[183278]: 2026-01-21 18:30:19.854 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 18:30:19 compute-0 nova_compute[183278]: 2026-01-21 18:30:19.854 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 18:30:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:30:20.089 104698 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:30:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:30:20.089 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:30:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:30:20.089 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:30:20 compute-0 nova_compute[183278]: 2026-01-21 18:30:20.517 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "refresh_cache-0df2cec9-7a16-40a4-96bf-c39f8782df91" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:30:20 compute-0 nova_compute[183278]: 2026-01-21 18:30:20.517 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquired lock "refresh_cache-0df2cec9-7a16-40a4-96bf-c39f8782df91" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 18:30:20 compute-0 nova_compute[183278]: 2026-01-21 18:30:20.517 183284 DEBUG nova.network.neutron [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 21 18:30:20 compute-0 nova_compute[183278]: 2026-01-21 18:30:20.517 183284 DEBUG nova.objects.instance [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lazy-loading 'info_cache' on Instance uuid 0df2cec9-7a16-40a4-96bf-c39f8782df91 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:30:20 compute-0 nova_compute[183278]: 2026-01-21 18:30:20.814 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:30:21 compute-0 nova_compute[183278]: 2026-01-21 18:30:21.968 183284 DEBUG nova.network.neutron [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Updating instance_info_cache with network_info: [{"id": "5acb086b-a54d-4c3a-a7af-bea2feeff1b4", "address": "fa:16:3e:ab:ae:e1", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5acb086b-a5", "ovs_interfaceid": "5acb086b-a54d-4c3a-a7af-bea2feeff1b4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:30:22 compute-0 nova_compute[183278]: 2026-01-21 18:30:22.083 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Releasing lock "refresh_cache-0df2cec9-7a16-40a4-96bf-c39f8782df91" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 18:30:22 compute-0 nova_compute[183278]: 2026-01-21 18:30:22.083 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 21 18:30:22 compute-0 nova_compute[183278]: 2026-01-21 18:30:22.083 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:30:22 compute-0 nova_compute[183278]: 2026-01-21 18:30:22.084 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:30:22 compute-0 nova_compute[183278]: 2026-01-21 18:30:22.113 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:30:22 compute-0 nova_compute[183278]: 2026-01-21 18:30:22.113 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:30:22 compute-0 nova_compute[183278]: 2026-01-21 18:30:22.113 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:30:22 compute-0 nova_compute[183278]: 2026-01-21 18:30:22.114 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 18:30:22 compute-0 nova_compute[183278]: 2026-01-21 18:30:22.178 183284 DEBUG oslo_concurrency.processutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0df2cec9-7a16-40a4-96bf-c39f8782df91/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:30:22 compute-0 nova_compute[183278]: 2026-01-21 18:30:22.231 183284 DEBUG oslo_concurrency.processutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0df2cec9-7a16-40a4-96bf-c39f8782df91/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:30:22 compute-0 nova_compute[183278]: 2026-01-21 18:30:22.232 183284 DEBUG oslo_concurrency.processutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0df2cec9-7a16-40a4-96bf-c39f8782df91/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:30:22 compute-0 nova_compute[183278]: 2026-01-21 18:30:22.282 183284 DEBUG oslo_concurrency.processutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0df2cec9-7a16-40a4-96bf-c39f8782df91/disk --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:30:22 compute-0 nova_compute[183278]: 2026-01-21 18:30:22.431 183284 WARNING nova.virt.libvirt.driver [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 18:30:22 compute-0 nova_compute[183278]: 2026-01-21 18:30:22.432 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5699MB free_disk=73.35226440429688GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 18:30:22 compute-0 nova_compute[183278]: 2026-01-21 18:30:22.433 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:30:22 compute-0 nova_compute[183278]: 2026-01-21 18:30:22.433 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:30:22 compute-0 nova_compute[183278]: 2026-01-21 18:30:22.538 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Instance 0df2cec9-7a16-40a4-96bf-c39f8782df91 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 21 18:30:22 compute-0 nova_compute[183278]: 2026-01-21 18:30:22.539 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 18:30:22 compute-0 nova_compute[183278]: 2026-01-21 18:30:22.539 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 18:30:22 compute-0 nova_compute[183278]: 2026-01-21 18:30:22.640 183284 DEBUG nova.compute.provider_tree [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Inventory has not changed in ProviderTree for provider: 502e4243-611b-433d-a766-9b485d51652d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 18:30:22 compute-0 nova_compute[183278]: 2026-01-21 18:30:22.662 183284 DEBUG nova.scheduler.client.report [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Inventory has not changed for provider 502e4243-611b-433d-a766-9b485d51652d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 18:30:22 compute-0 nova_compute[183278]: 2026-01-21 18:30:22.684 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 18:30:22 compute-0 nova_compute[183278]: 2026-01-21 18:30:22.684 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.251s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:30:22 compute-0 nova_compute[183278]: 2026-01-21 18:30:22.684 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:30:22 compute-0 nova_compute[183278]: 2026-01-21 18:30:22.685 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 21 18:30:22 compute-0 nova_compute[183278]: 2026-01-21 18:30:22.697 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 21 18:30:24 compute-0 nova_compute[183278]: 2026-01-21 18:30:24.596 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:30:25 compute-0 nova_compute[183278]: 2026-01-21 18:30:25.040 183284 DEBUG nova.virt.libvirt.driver [None req-f74d8c4b-77e6-477a-a450-06a2ad13a723 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Check if temp file /var/lib/nova/instances/tmpumsco_yl exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Jan 21 18:30:25 compute-0 nova_compute[183278]: 2026-01-21 18:30:25.041 183284 DEBUG nova.compute.manager [None req-f74d8c4b-77e6-477a-a450-06a2ad13a723 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpumsco_yl',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='0df2cec9-7a16-40a4-96bf-c39f8782df91',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Jan 21 18:30:25 compute-0 nova_compute[183278]: 2026-01-21 18:30:25.430 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:30:25 compute-0 nova_compute[183278]: 2026-01-21 18:30:25.430 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:30:25 compute-0 nova_compute[183278]: 2026-01-21 18:30:25.542 183284 DEBUG oslo_concurrency.processutils [None req-f74d8c4b-77e6-477a-a450-06a2ad13a723 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0df2cec9-7a16-40a4-96bf-c39f8782df91/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:30:25 compute-0 nova_compute[183278]: 2026-01-21 18:30:25.618 183284 DEBUG oslo_concurrency.processutils [None req-f74d8c4b-77e6-477a-a450-06a2ad13a723 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0df2cec9-7a16-40a4-96bf-c39f8782df91/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:30:25 compute-0 nova_compute[183278]: 2026-01-21 18:30:25.620 183284 DEBUG oslo_concurrency.processutils [None req-f74d8c4b-77e6-477a-a450-06a2ad13a723 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0df2cec9-7a16-40a4-96bf-c39f8782df91/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:30:25 compute-0 nova_compute[183278]: 2026-01-21 18:30:25.679 183284 DEBUG oslo_concurrency.processutils [None req-f74d8c4b-77e6-477a-a450-06a2ad13a723 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0df2cec9-7a16-40a4-96bf-c39f8782df91/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:30:25 compute-0 nova_compute[183278]: 2026-01-21 18:30:25.815 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:30:25 compute-0 nova_compute[183278]: 2026-01-21 18:30:25.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:30:25 compute-0 nova_compute[183278]: 2026-01-21 18:30:25.859 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:30:29 compute-0 sshd-session[209593]: Accepted publickey for nova from 192.168.122.101 port 58818 ssh2: ECDSA SHA256:29a5JNhHHz2bb0ACqZTr6qOKeSRnhiTRA8SK+rzn9gs
Jan 21 18:30:29 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Jan 21 18:30:29 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 21 18:30:29 compute-0 systemd-logind[782]: New session 36 of user nova.
Jan 21 18:30:29 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 21 18:30:29 compute-0 systemd[1]: Starting User Manager for UID 42436...
Jan 21 18:30:29 compute-0 systemd[209597]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 21 18:30:29 compute-0 nova_compute[183278]: 2026-01-21 18:30:29.598 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:30:29 compute-0 systemd[209597]: Queued start job for default target Main User Target.
Jan 21 18:30:29 compute-0 systemd[209597]: Created slice User Application Slice.
Jan 21 18:30:29 compute-0 systemd[209597]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 21 18:30:29 compute-0 systemd[209597]: Started Daily Cleanup of User's Temporary Directories.
Jan 21 18:30:29 compute-0 systemd[209597]: Reached target Paths.
Jan 21 18:30:29 compute-0 systemd[209597]: Reached target Timers.
Jan 21 18:30:29 compute-0 systemd[209597]: Starting D-Bus User Message Bus Socket...
Jan 21 18:30:29 compute-0 systemd[209597]: Starting Create User's Volatile Files and Directories...
Jan 21 18:30:29 compute-0 systemd[209597]: Finished Create User's Volatile Files and Directories.
Jan 21 18:30:29 compute-0 systemd[209597]: Listening on D-Bus User Message Bus Socket.
Jan 21 18:30:29 compute-0 systemd[209597]: Reached target Sockets.
Jan 21 18:30:29 compute-0 systemd[209597]: Reached target Basic System.
Jan 21 18:30:29 compute-0 systemd[209597]: Reached target Main User Target.
Jan 21 18:30:29 compute-0 systemd[209597]: Startup finished in 116ms.
Jan 21 18:30:29 compute-0 systemd[1]: Started User Manager for UID 42436.
Jan 21 18:30:29 compute-0 systemd[1]: Started Session 36 of User nova.
Jan 21 18:30:29 compute-0 sshd-session[209593]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 21 18:30:29 compute-0 podman[192560]: time="2026-01-21T18:30:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 18:30:29 compute-0 podman[192560]: @ - - [21/Jan/2026:18:30:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16587 "" "Go-http-client/1.1"
Jan 21 18:30:29 compute-0 podman[192560]: @ - - [21/Jan/2026:18:30:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2633 "" "Go-http-client/1.1"
Jan 21 18:30:29 compute-0 sshd-session[209612]: Received disconnect from 192.168.122.101 port 58818:11: disconnected by user
Jan 21 18:30:29 compute-0 sshd-session[209612]: Disconnected from user nova 192.168.122.101 port 58818
Jan 21 18:30:29 compute-0 sshd-session[209593]: pam_unix(sshd:session): session closed for user nova
Jan 21 18:30:29 compute-0 systemd[1]: session-36.scope: Deactivated successfully.
Jan 21 18:30:29 compute-0 systemd-logind[782]: Session 36 logged out. Waiting for processes to exit.
Jan 21 18:30:29 compute-0 systemd-logind[782]: Removed session 36.
Jan 21 18:30:29 compute-0 nova_compute[183278]: 2026-01-21 18:30:29.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:30:29 compute-0 nova_compute[183278]: 2026-01-21 18:30:29.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:30:29 compute-0 nova_compute[183278]: 2026-01-21 18:30:29.817 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 18:30:30 compute-0 nova_compute[183278]: 2026-01-21 18:30:30.845 183284 DEBUG nova.compute.manager [req-1d209b2d-9539-4464-937d-437e9d80a7e1 req-70f41cfa-7ac8-470d-b727-5b2a44e673e0 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Received event network-vif-unplugged-5acb086b-a54d-4c3a-a7af-bea2feeff1b4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:30:30 compute-0 nova_compute[183278]: 2026-01-21 18:30:30.845 183284 DEBUG oslo_concurrency.lockutils [req-1d209b2d-9539-4464-937d-437e9d80a7e1 req-70f41cfa-7ac8-470d-b727-5b2a44e673e0 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "0df2cec9-7a16-40a4-96bf-c39f8782df91-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:30:30 compute-0 nova_compute[183278]: 2026-01-21 18:30:30.846 183284 DEBUG oslo_concurrency.lockutils [req-1d209b2d-9539-4464-937d-437e9d80a7e1 req-70f41cfa-7ac8-470d-b727-5b2a44e673e0 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "0df2cec9-7a16-40a4-96bf-c39f8782df91-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:30:30 compute-0 nova_compute[183278]: 2026-01-21 18:30:30.846 183284 DEBUG oslo_concurrency.lockutils [req-1d209b2d-9539-4464-937d-437e9d80a7e1 req-70f41cfa-7ac8-470d-b727-5b2a44e673e0 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "0df2cec9-7a16-40a4-96bf-c39f8782df91-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:30:30 compute-0 nova_compute[183278]: 2026-01-21 18:30:30.846 183284 DEBUG nova.compute.manager [req-1d209b2d-9539-4464-937d-437e9d80a7e1 req-70f41cfa-7ac8-470d-b727-5b2a44e673e0 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] No waiting events found dispatching network-vif-unplugged-5acb086b-a54d-4c3a-a7af-bea2feeff1b4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:30:30 compute-0 nova_compute[183278]: 2026-01-21 18:30:30.846 183284 DEBUG nova.compute.manager [req-1d209b2d-9539-4464-937d-437e9d80a7e1 req-70f41cfa-7ac8-470d-b727-5b2a44e673e0 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Received event network-vif-unplugged-5acb086b-a54d-4c3a-a7af-bea2feeff1b4 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 21 18:30:30 compute-0 nova_compute[183278]: 2026-01-21 18:30:30.859 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:30:31 compute-0 openstack_network_exporter[195402]: ERROR   18:30:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 21 18:30:31 compute-0 openstack_network_exporter[195402]: ERROR   18:30:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 21 18:30:32 compute-0 nova_compute[183278]: 2026-01-21 18:30:32.590 183284 INFO nova.compute.manager [None req-f74d8c4b-77e6-477a-a450-06a2ad13a723 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Took 6.91 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Jan 21 18:30:32 compute-0 nova_compute[183278]: 2026-01-21 18:30:32.591 183284 DEBUG nova.compute.manager [None req-f74d8c4b-77e6-477a-a450-06a2ad13a723 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 18:30:32 compute-0 nova_compute[183278]: 2026-01-21 18:30:32.613 183284 DEBUG nova.compute.manager [None req-f74d8c4b-77e6-477a-a450-06a2ad13a723 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpumsco_yl',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='0df2cec9-7a16-40a4-96bf-c39f8782df91',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(fe5ca158-35cc-4c7e-ab91-8b4e19a8a4f8),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Jan 21 18:30:32 compute-0 nova_compute[183278]: 2026-01-21 18:30:32.635 183284 DEBUG nova.objects.instance [None req-f74d8c4b-77e6-477a-a450-06a2ad13a723 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lazy-loading 'migration_context' on Instance uuid 0df2cec9-7a16-40a4-96bf-c39f8782df91 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:30:32 compute-0 nova_compute[183278]: 2026-01-21 18:30:32.636 183284 DEBUG nova.virt.libvirt.driver [None req-f74d8c4b-77e6-477a-a450-06a2ad13a723 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Jan 21 18:30:32 compute-0 nova_compute[183278]: 2026-01-21 18:30:32.637 183284 DEBUG nova.virt.libvirt.driver [None req-f74d8c4b-77e6-477a-a450-06a2ad13a723 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Jan 21 18:30:32 compute-0 nova_compute[183278]: 2026-01-21 18:30:32.637 183284 DEBUG nova.virt.libvirt.driver [None req-f74d8c4b-77e6-477a-a450-06a2ad13a723 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Jan 21 18:30:32 compute-0 nova_compute[183278]: 2026-01-21 18:30:32.652 183284 DEBUG nova.virt.libvirt.vif [None req-f74d8c4b-77e6-477a-a450-06a2ad13a723 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T18:29:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1136199599',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1136199599',id=17,image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T18:29:31Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='fe688847145f4dee992c72dd40bbc1ac',ramdisk_id='',reservation_id='r-f59f2q57',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',
owner_project_name='tempest-TestExecuteStrategies-1753607426',owner_user_name='tempest-TestExecuteStrategies-1753607426-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T18:29:31Z,user_data=None,user_id='41dc6e790bc54fbfaf5c6007d3fa5f63',uuid=0df2cec9-7a16-40a4-96bf-c39f8782df91,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5acb086b-a54d-4c3a-a7af-bea2feeff1b4", "address": "fa:16:3e:ab:ae:e1", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap5acb086b-a5", "ovs_interfaceid": "5acb086b-a54d-4c3a-a7af-bea2feeff1b4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 18:30:32 compute-0 nova_compute[183278]: 2026-01-21 18:30:32.652 183284 DEBUG nova.network.os_vif_util [None req-f74d8c4b-77e6-477a-a450-06a2ad13a723 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Converting VIF {"id": "5acb086b-a54d-4c3a-a7af-bea2feeff1b4", "address": "fa:16:3e:ab:ae:e1", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap5acb086b-a5", "ovs_interfaceid": "5acb086b-a54d-4c3a-a7af-bea2feeff1b4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 18:30:32 compute-0 nova_compute[183278]: 2026-01-21 18:30:32.653 183284 DEBUG nova.network.os_vif_util [None req-f74d8c4b-77e6-477a-a450-06a2ad13a723 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ab:ae:e1,bridge_name='br-int',has_traffic_filtering=True,id=5acb086b-a54d-4c3a-a7af-bea2feeff1b4,network=Network(405ec01b-76d3-4c3c-a31b-5f16d9641fbf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5acb086b-a5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 18:30:32 compute-0 nova_compute[183278]: 2026-01-21 18:30:32.653 183284 DEBUG nova.virt.libvirt.migration [None req-f74d8c4b-77e6-477a-a450-06a2ad13a723 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Updating guest XML with vif config: <interface type="ethernet">
Jan 21 18:30:32 compute-0 nova_compute[183278]:   <mac address="fa:16:3e:ab:ae:e1"/>
Jan 21 18:30:32 compute-0 nova_compute[183278]:   <model type="virtio"/>
Jan 21 18:30:32 compute-0 nova_compute[183278]:   <driver name="vhost" rx_queue_size="512"/>
Jan 21 18:30:32 compute-0 nova_compute[183278]:   <mtu size="1442"/>
Jan 21 18:30:32 compute-0 nova_compute[183278]:   <target dev="tap5acb086b-a5"/>
Jan 21 18:30:32 compute-0 nova_compute[183278]: </interface>
Jan 21 18:30:32 compute-0 nova_compute[183278]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Jan 21 18:30:32 compute-0 nova_compute[183278]: 2026-01-21 18:30:32.654 183284 DEBUG nova.virt.libvirt.driver [None req-f74d8c4b-77e6-477a-a450-06a2ad13a723 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Jan 21 18:30:32 compute-0 nova_compute[183278]: 2026-01-21 18:30:32.933 183284 DEBUG nova.compute.manager [req-0943061b-f343-471f-8172-68a73095b2e4 req-96178673-625d-4292-a390-7aca9b7b2e3e 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Received event network-vif-plugged-5acb086b-a54d-4c3a-a7af-bea2feeff1b4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:30:32 compute-0 nova_compute[183278]: 2026-01-21 18:30:32.934 183284 DEBUG oslo_concurrency.lockutils [req-0943061b-f343-471f-8172-68a73095b2e4 req-96178673-625d-4292-a390-7aca9b7b2e3e 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "0df2cec9-7a16-40a4-96bf-c39f8782df91-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:30:32 compute-0 nova_compute[183278]: 2026-01-21 18:30:32.934 183284 DEBUG oslo_concurrency.lockutils [req-0943061b-f343-471f-8172-68a73095b2e4 req-96178673-625d-4292-a390-7aca9b7b2e3e 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "0df2cec9-7a16-40a4-96bf-c39f8782df91-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:30:32 compute-0 nova_compute[183278]: 2026-01-21 18:30:32.935 183284 DEBUG oslo_concurrency.lockutils [req-0943061b-f343-471f-8172-68a73095b2e4 req-96178673-625d-4292-a390-7aca9b7b2e3e 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "0df2cec9-7a16-40a4-96bf-c39f8782df91-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:30:32 compute-0 nova_compute[183278]: 2026-01-21 18:30:32.935 183284 DEBUG nova.compute.manager [req-0943061b-f343-471f-8172-68a73095b2e4 req-96178673-625d-4292-a390-7aca9b7b2e3e 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] No waiting events found dispatching network-vif-plugged-5acb086b-a54d-4c3a-a7af-bea2feeff1b4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:30:32 compute-0 nova_compute[183278]: 2026-01-21 18:30:32.935 183284 WARNING nova.compute.manager [req-0943061b-f343-471f-8172-68a73095b2e4 req-96178673-625d-4292-a390-7aca9b7b2e3e 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Received unexpected event network-vif-plugged-5acb086b-a54d-4c3a-a7af-bea2feeff1b4 for instance with vm_state active and task_state migrating.
Jan 21 18:30:32 compute-0 nova_compute[183278]: 2026-01-21 18:30:32.935 183284 DEBUG nova.compute.manager [req-0943061b-f343-471f-8172-68a73095b2e4 req-96178673-625d-4292-a390-7aca9b7b2e3e 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Received event network-changed-5acb086b-a54d-4c3a-a7af-bea2feeff1b4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:30:32 compute-0 nova_compute[183278]: 2026-01-21 18:30:32.935 183284 DEBUG nova.compute.manager [req-0943061b-f343-471f-8172-68a73095b2e4 req-96178673-625d-4292-a390-7aca9b7b2e3e 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Refreshing instance network info cache due to event network-changed-5acb086b-a54d-4c3a-a7af-bea2feeff1b4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 18:30:32 compute-0 nova_compute[183278]: 2026-01-21 18:30:32.936 183284 DEBUG oslo_concurrency.lockutils [req-0943061b-f343-471f-8172-68a73095b2e4 req-96178673-625d-4292-a390-7aca9b7b2e3e 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "refresh_cache-0df2cec9-7a16-40a4-96bf-c39f8782df91" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:30:32 compute-0 nova_compute[183278]: 2026-01-21 18:30:32.936 183284 DEBUG oslo_concurrency.lockutils [req-0943061b-f343-471f-8172-68a73095b2e4 req-96178673-625d-4292-a390-7aca9b7b2e3e 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquired lock "refresh_cache-0df2cec9-7a16-40a4-96bf-c39f8782df91" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 18:30:32 compute-0 nova_compute[183278]: 2026-01-21 18:30:32.936 183284 DEBUG nova.network.neutron [req-0943061b-f343-471f-8172-68a73095b2e4 req-96178673-625d-4292-a390-7aca9b7b2e3e 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Refreshing network info cache for port 5acb086b-a54d-4c3a-a7af-bea2feeff1b4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 18:30:33 compute-0 nova_compute[183278]: 2026-01-21 18:30:33.140 183284 DEBUG nova.virt.libvirt.migration [None req-f74d8c4b-77e6-477a-a450-06a2ad13a723 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 21 18:30:33 compute-0 nova_compute[183278]: 2026-01-21 18:30:33.141 183284 INFO nova.virt.libvirt.migration [None req-f74d8c4b-77e6-477a-a450-06a2ad13a723 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Increasing downtime to 50 ms after 0 sec elapsed time
Jan 21 18:30:33 compute-0 nova_compute[183278]: 2026-01-21 18:30:33.210 183284 INFO nova.virt.libvirt.driver [None req-f74d8c4b-77e6-477a-a450-06a2ad13a723 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Jan 21 18:30:33 compute-0 nova_compute[183278]: 2026-01-21 18:30:33.714 183284 DEBUG nova.virt.libvirt.migration [None req-f74d8c4b-77e6-477a-a450-06a2ad13a723 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 21 18:30:33 compute-0 nova_compute[183278]: 2026-01-21 18:30:33.715 183284 DEBUG nova.virt.libvirt.migration [None req-f74d8c4b-77e6-477a-a450-06a2ad13a723 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Jan 21 18:30:34 compute-0 nova_compute[183278]: 2026-01-21 18:30:34.218 183284 DEBUG nova.virt.libvirt.migration [None req-f74d8c4b-77e6-477a-a450-06a2ad13a723 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 21 18:30:34 compute-0 nova_compute[183278]: 2026-01-21 18:30:34.219 183284 DEBUG nova.virt.libvirt.migration [None req-f74d8c4b-77e6-477a-a450-06a2ad13a723 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Jan 21 18:30:34 compute-0 sshd-session[209614]: Received disconnect from 125.124.24.140 port 33498:11:  [preauth]
Jan 21 18:30:34 compute-0 sshd-session[209614]: Disconnected from authenticating user operator 125.124.24.140 port 33498 [preauth]
Jan 21 18:30:34 compute-0 nova_compute[183278]: 2026-01-21 18:30:34.308 183284 DEBUG nova.virt.driver [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Emitting event <LifecycleEvent: 1769020234.307598, 0df2cec9-7a16-40a4-96bf-c39f8782df91 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:30:34 compute-0 nova_compute[183278]: 2026-01-21 18:30:34.308 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] VM Paused (Lifecycle Event)
Jan 21 18:30:34 compute-0 nova_compute[183278]: 2026-01-21 18:30:34.329 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:30:34 compute-0 nova_compute[183278]: 2026-01-21 18:30:34.334 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 18:30:34 compute-0 nova_compute[183278]: 2026-01-21 18:30:34.367 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] During sync_power_state the instance has a pending task (migrating). Skip.
Jan 21 18:30:34 compute-0 nova_compute[183278]: 2026-01-21 18:30:34.601 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:30:34 compute-0 nova_compute[183278]: 2026-01-21 18:30:34.675 183284 DEBUG nova.network.neutron [req-0943061b-f343-471f-8172-68a73095b2e4 req-96178673-625d-4292-a390-7aca9b7b2e3e 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Updated VIF entry in instance network info cache for port 5acb086b-a54d-4c3a-a7af-bea2feeff1b4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 18:30:34 compute-0 nova_compute[183278]: 2026-01-21 18:30:34.675 183284 DEBUG nova.network.neutron [req-0943061b-f343-471f-8172-68a73095b2e4 req-96178673-625d-4292-a390-7aca9b7b2e3e 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Updating instance_info_cache with network_info: [{"id": "5acb086b-a54d-4c3a-a7af-bea2feeff1b4", "address": "fa:16:3e:ab:ae:e1", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5acb086b-a5", "ovs_interfaceid": "5acb086b-a54d-4c3a-a7af-bea2feeff1b4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:30:34 compute-0 nova_compute[183278]: 2026-01-21 18:30:34.694 183284 DEBUG oslo_concurrency.lockutils [req-0943061b-f343-471f-8172-68a73095b2e4 req-96178673-625d-4292-a390-7aca9b7b2e3e 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Releasing lock "refresh_cache-0df2cec9-7a16-40a4-96bf-c39f8782df91" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 18:30:34 compute-0 nova_compute[183278]: 2026-01-21 18:30:34.722 183284 DEBUG nova.virt.libvirt.migration [None req-f74d8c4b-77e6-477a-a450-06a2ad13a723 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 21 18:30:34 compute-0 nova_compute[183278]: 2026-01-21 18:30:34.723 183284 DEBUG nova.virt.libvirt.migration [None req-f74d8c4b-77e6-477a-a450-06a2ad13a723 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Jan 21 18:30:34 compute-0 kernel: tap5acb086b-a5 (unregistering): left promiscuous mode
Jan 21 18:30:34 compute-0 NetworkManager[55506]: <info>  [1769020234.8351] device (tap5acb086b-a5): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 18:30:34 compute-0 ovn_controller[95419]: 2026-01-21T18:30:34Z|00133|binding|INFO|Releasing lport 5acb086b-a54d-4c3a-a7af-bea2feeff1b4 from this chassis (sb_readonly=0)
Jan 21 18:30:34 compute-0 ovn_controller[95419]: 2026-01-21T18:30:34Z|00134|binding|INFO|Setting lport 5acb086b-a54d-4c3a-a7af-bea2feeff1b4 down in Southbound
Jan 21 18:30:34 compute-0 ovn_controller[95419]: 2026-01-21T18:30:34Z|00135|binding|INFO|Removing iface tap5acb086b-a5 ovn-installed in OVS
Jan 21 18:30:34 compute-0 nova_compute[183278]: 2026-01-21 18:30:34.886 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:30:34 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:30:34.896 104698 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ab:ae:e1 10.100.0.6'], port_security=['fa:16:3e:ab:ae:e1 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '88a62794-b4a4-47e3-9cce-91e574e684c1'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '0df2cec9-7a16-40a4-96bf-c39f8782df91', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-405ec01b-76d3-4c3c-a31b-5f16d9641fbf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe688847145f4dee992c72dd40bbc1ac', 'neutron:revision_number': '8', 'neutron:security_group_ids': '772bc98e-4f63-476c-ac15-1e185ee339f7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=64b67886-c0d9-40d2-a2d0-cf96a9cd3c14, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>], logical_port=5acb086b-a54d-4c3a-a7af-bea2feeff1b4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 18:30:34 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:30:34.898 104698 INFO neutron.agent.ovn.metadata.agent [-] Port 5acb086b-a54d-4c3a-a7af-bea2feeff1b4 in datapath 405ec01b-76d3-4c3c-a31b-5f16d9641fbf unbound from our chassis
Jan 21 18:30:34 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:30:34.899 104698 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 405ec01b-76d3-4c3c-a31b-5f16d9641fbf, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 21 18:30:34 compute-0 nova_compute[183278]: 2026-01-21 18:30:34.900 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:30:34 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:30:34.900 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[631dab29-daa9-4710-9ee4-7b14441459d7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:30:34 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:30:34.901 104698 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf namespace which is not needed anymore
Jan 21 18:30:34 compute-0 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d00000011.scope: Deactivated successfully.
Jan 21 18:30:34 compute-0 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d00000011.scope: Consumed 16.246s CPU time.
Jan 21 18:30:34 compute-0 systemd-machined[154592]: Machine qemu-12-instance-00000011 terminated.
Jan 21 18:30:34 compute-0 podman[209634]: 2026-01-21 18:30:34.975480329 +0000 UTC m=+0.069052551 container health_status 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, managed_by=edpm_ansible, version=9.6, release=1755695350, io.buildah.version=1.33.7, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, distribution-scope=public, maintainer=Red Hat, Inc., name=ubi9-minimal, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git)
Jan 21 18:30:35 compute-0 neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf[209388]: [NOTICE]   (209392) : haproxy version is 2.8.14-c23fe91
Jan 21 18:30:35 compute-0 neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf[209388]: [NOTICE]   (209392) : path to executable is /usr/sbin/haproxy
Jan 21 18:30:35 compute-0 neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf[209388]: [WARNING]  (209392) : Exiting Master process...
Jan 21 18:30:35 compute-0 neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf[209388]: [ALERT]    (209392) : Current worker (209394) exited with code 143 (Terminated)
Jan 21 18:30:35 compute-0 neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf[209388]: [WARNING]  (209392) : All workers exited. Exiting... (0)
Jan 21 18:30:35 compute-0 systemd[1]: libpod-430a5d02bf3a6734d96c8d3c91d695b8f377bfa48e24185af56d61aee24d192f.scope: Deactivated successfully.
Jan 21 18:30:35 compute-0 nova_compute[183278]: 2026-01-21 18:30:35.037 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:30:35 compute-0 nova_compute[183278]: 2026-01-21 18:30:35.041 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:30:35 compute-0 podman[209676]: 2026-01-21 18:30:35.042028037 +0000 UTC m=+0.047819197 container died 430a5d02bf3a6734d96c8d3c91d695b8f377bfa48e24185af56d61aee24d192f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 21 18:30:35 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-430a5d02bf3a6734d96c8d3c91d695b8f377bfa48e24185af56d61aee24d192f-userdata-shm.mount: Deactivated successfully.
Jan 21 18:30:35 compute-0 nova_compute[183278]: 2026-01-21 18:30:35.079 183284 DEBUG nova.compute.manager [req-96acaadc-8879-4a22-86cd-eedca505dbf5 req-dc0f869e-1662-4a03-bbe0-f8dfdff81389 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Received event network-vif-unplugged-5acb086b-a54d-4c3a-a7af-bea2feeff1b4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:30:35 compute-0 nova_compute[183278]: 2026-01-21 18:30:35.079 183284 DEBUG oslo_concurrency.lockutils [req-96acaadc-8879-4a22-86cd-eedca505dbf5 req-dc0f869e-1662-4a03-bbe0-f8dfdff81389 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "0df2cec9-7a16-40a4-96bf-c39f8782df91-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:30:35 compute-0 nova_compute[183278]: 2026-01-21 18:30:35.080 183284 DEBUG oslo_concurrency.lockutils [req-96acaadc-8879-4a22-86cd-eedca505dbf5 req-dc0f869e-1662-4a03-bbe0-f8dfdff81389 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "0df2cec9-7a16-40a4-96bf-c39f8782df91-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:30:35 compute-0 nova_compute[183278]: 2026-01-21 18:30:35.080 183284 DEBUG oslo_concurrency.lockutils [req-96acaadc-8879-4a22-86cd-eedca505dbf5 req-dc0f869e-1662-4a03-bbe0-f8dfdff81389 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "0df2cec9-7a16-40a4-96bf-c39f8782df91-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:30:35 compute-0 nova_compute[183278]: 2026-01-21 18:30:35.080 183284 DEBUG nova.compute.manager [req-96acaadc-8879-4a22-86cd-eedca505dbf5 req-dc0f869e-1662-4a03-bbe0-f8dfdff81389 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] No waiting events found dispatching network-vif-unplugged-5acb086b-a54d-4c3a-a7af-bea2feeff1b4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:30:35 compute-0 nova_compute[183278]: 2026-01-21 18:30:35.080 183284 DEBUG nova.compute.manager [req-96acaadc-8879-4a22-86cd-eedca505dbf5 req-dc0f869e-1662-4a03-bbe0-f8dfdff81389 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Received event network-vif-unplugged-5acb086b-a54d-4c3a-a7af-bea2feeff1b4 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 21 18:30:35 compute-0 nova_compute[183278]: 2026-01-21 18:30:35.081 183284 DEBUG nova.virt.libvirt.driver [None req-f74d8c4b-77e6-477a-a450-06a2ad13a723 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Jan 21 18:30:35 compute-0 nova_compute[183278]: 2026-01-21 18:30:35.081 183284 DEBUG nova.virt.libvirt.driver [None req-f74d8c4b-77e6-477a-a450-06a2ad13a723 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Jan 21 18:30:35 compute-0 nova_compute[183278]: 2026-01-21 18:30:35.082 183284 DEBUG nova.virt.libvirt.driver [None req-f74d8c4b-77e6-477a-a450-06a2ad13a723 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Jan 21 18:30:35 compute-0 systemd[1]: var-lib-containers-storage-overlay-a0289f9466c8dd285f3bbaff940e3095917b75b6332b77401cb3f6ca64d66201-merged.mount: Deactivated successfully.
Jan 21 18:30:35 compute-0 podman[209676]: 2026-01-21 18:30:35.091606646 +0000 UTC m=+0.097397786 container cleanup 430a5d02bf3a6734d96c8d3c91d695b8f377bfa48e24185af56d61aee24d192f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 21 18:30:35 compute-0 systemd[1]: libpod-conmon-430a5d02bf3a6734d96c8d3c91d695b8f377bfa48e24185af56d61aee24d192f.scope: Deactivated successfully.
Jan 21 18:30:35 compute-0 nova_compute[183278]: 2026-01-21 18:30:35.225 183284 DEBUG nova.virt.libvirt.guest [None req-f74d8c4b-77e6-477a-a450-06a2ad13a723 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid '0df2cec9-7a16-40a4-96bf-c39f8782df91' (instance-00000011) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Jan 21 18:30:35 compute-0 nova_compute[183278]: 2026-01-21 18:30:35.226 183284 INFO nova.virt.libvirt.driver [None req-f74d8c4b-77e6-477a-a450-06a2ad13a723 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Migration operation has completed
Jan 21 18:30:35 compute-0 nova_compute[183278]: 2026-01-21 18:30:35.226 183284 INFO nova.compute.manager [None req-f74d8c4b-77e6-477a-a450-06a2ad13a723 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] _post_live_migration() is started..
Jan 21 18:30:35 compute-0 nova_compute[183278]: 2026-01-21 18:30:35.861 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:30:36 compute-0 nova_compute[183278]: 2026-01-21 18:30:36.034 183284 DEBUG nova.network.neutron [None req-f74d8c4b-77e6-477a-a450-06a2ad13a723 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Activated binding for port 5acb086b-a54d-4c3a-a7af-bea2feeff1b4 and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Jan 21 18:30:36 compute-0 nova_compute[183278]: 2026-01-21 18:30:36.035 183284 DEBUG nova.compute.manager [None req-f74d8c4b-77e6-477a-a450-06a2ad13a723 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "5acb086b-a54d-4c3a-a7af-bea2feeff1b4", "address": "fa:16:3e:ab:ae:e1", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5acb086b-a5", "ovs_interfaceid": "5acb086b-a54d-4c3a-a7af-bea2feeff1b4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Jan 21 18:30:36 compute-0 nova_compute[183278]: 2026-01-21 18:30:36.035 183284 DEBUG nova.virt.libvirt.vif [None req-f74d8c4b-77e6-477a-a450-06a2ad13a723 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T18:29:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1136199599',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1136199599',id=17,image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T18:29:31Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='fe688847145f4dee992c72dd40bbc1ac',ramdisk_id='',reservation_id='r-f59f2q57',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',
owner_project_name='tempest-TestExecuteStrategies-1753607426',owner_user_name='tempest-TestExecuteStrategies-1753607426-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T18:30:23Z,user_data=None,user_id='41dc6e790bc54fbfaf5c6007d3fa5f63',uuid=0df2cec9-7a16-40a4-96bf-c39f8782df91,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5acb086b-a54d-4c3a-a7af-bea2feeff1b4", "address": "fa:16:3e:ab:ae:e1", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5acb086b-a5", "ovs_interfaceid": "5acb086b-a54d-4c3a-a7af-bea2feeff1b4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 18:30:36 compute-0 nova_compute[183278]: 2026-01-21 18:30:36.036 183284 DEBUG nova.network.os_vif_util [None req-f74d8c4b-77e6-477a-a450-06a2ad13a723 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Converting VIF {"id": "5acb086b-a54d-4c3a-a7af-bea2feeff1b4", "address": "fa:16:3e:ab:ae:e1", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5acb086b-a5", "ovs_interfaceid": "5acb086b-a54d-4c3a-a7af-bea2feeff1b4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 18:30:36 compute-0 nova_compute[183278]: 2026-01-21 18:30:36.036 183284 DEBUG nova.network.os_vif_util [None req-f74d8c4b-77e6-477a-a450-06a2ad13a723 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ab:ae:e1,bridge_name='br-int',has_traffic_filtering=True,id=5acb086b-a54d-4c3a-a7af-bea2feeff1b4,network=Network(405ec01b-76d3-4c3c-a31b-5f16d9641fbf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5acb086b-a5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 18:30:36 compute-0 nova_compute[183278]: 2026-01-21 18:30:36.036 183284 DEBUG os_vif [None req-f74d8c4b-77e6-477a-a450-06a2ad13a723 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ab:ae:e1,bridge_name='br-int',has_traffic_filtering=True,id=5acb086b-a54d-4c3a-a7af-bea2feeff1b4,network=Network(405ec01b-76d3-4c3c-a31b-5f16d9641fbf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5acb086b-a5') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 21 18:30:36 compute-0 nova_compute[183278]: 2026-01-21 18:30:36.038 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:30:36 compute-0 nova_compute[183278]: 2026-01-21 18:30:36.038 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5acb086b-a5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:30:36 compute-0 nova_compute[183278]: 2026-01-21 18:30:36.039 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:30:36 compute-0 nova_compute[183278]: 2026-01-21 18:30:36.042 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 18:30:36 compute-0 nova_compute[183278]: 2026-01-21 18:30:36.046 183284 INFO os_vif [None req-f74d8c4b-77e6-477a-a450-06a2ad13a723 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ab:ae:e1,bridge_name='br-int',has_traffic_filtering=True,id=5acb086b-a54d-4c3a-a7af-bea2feeff1b4,network=Network(405ec01b-76d3-4c3c-a31b-5f16d9641fbf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5acb086b-a5')
Jan 21 18:30:36 compute-0 nova_compute[183278]: 2026-01-21 18:30:36.046 183284 DEBUG oslo_concurrency.lockutils [None req-f74d8c4b-77e6-477a-a450-06a2ad13a723 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:30:36 compute-0 nova_compute[183278]: 2026-01-21 18:30:36.046 183284 DEBUG oslo_concurrency.lockutils [None req-f74d8c4b-77e6-477a-a450-06a2ad13a723 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:30:36 compute-0 nova_compute[183278]: 2026-01-21 18:30:36.047 183284 DEBUG oslo_concurrency.lockutils [None req-f74d8c4b-77e6-477a-a450-06a2ad13a723 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:30:36 compute-0 nova_compute[183278]: 2026-01-21 18:30:36.047 183284 DEBUG nova.compute.manager [None req-f74d8c4b-77e6-477a-a450-06a2ad13a723 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Jan 21 18:30:36 compute-0 nova_compute[183278]: 2026-01-21 18:30:36.047 183284 INFO nova.virt.libvirt.driver [None req-f74d8c4b-77e6-477a-a450-06a2ad13a723 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Deleting instance files /var/lib/nova/instances/0df2cec9-7a16-40a4-96bf-c39f8782df91_del
Jan 21 18:30:36 compute-0 nova_compute[183278]: 2026-01-21 18:30:36.048 183284 INFO nova.virt.libvirt.driver [None req-f74d8c4b-77e6-477a-a450-06a2ad13a723 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Deletion of /var/lib/nova/instances/0df2cec9-7a16-40a4-96bf-c39f8782df91_del complete
Jan 21 18:30:36 compute-0 podman[209721]: 2026-01-21 18:30:36.076566628 +0000 UTC m=+0.959748923 container remove 430a5d02bf3a6734d96c8d3c91d695b8f377bfa48e24185af56d61aee24d192f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 21 18:30:36 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:30:36.082 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[6995d357-6734-4c08-8784-e1959a706f97]: (4, ('Wed Jan 21 06:30:34 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf (430a5d02bf3a6734d96c8d3c91d695b8f377bfa48e24185af56d61aee24d192f)\n430a5d02bf3a6734d96c8d3c91d695b8f377bfa48e24185af56d61aee24d192f\nWed Jan 21 06:30:35 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf (430a5d02bf3a6734d96c8d3c91d695b8f377bfa48e24185af56d61aee24d192f)\n430a5d02bf3a6734d96c8d3c91d695b8f377bfa48e24185af56d61aee24d192f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:30:36 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:30:36.083 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[b87729f1-0197-4984-a99f-9513ab3effb4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:30:36 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:30:36.084 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap405ec01b-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:30:36 compute-0 nova_compute[183278]: 2026-01-21 18:30:36.085 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:30:36 compute-0 kernel: tap405ec01b-70: left promiscuous mode
Jan 21 18:30:36 compute-0 nova_compute[183278]: 2026-01-21 18:30:36.100 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:30:36 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:30:36.103 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[d8b7417a-29a1-4ba9-8992-bdd1585b0874]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:30:36 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:30:36.116 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[288bc685-613e-47fc-badf-501898a3a610]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:30:36 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:30:36.117 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[6496c331-370e-4dba-88d4-0842bedefe33]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:30:36 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:30:36.133 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[a0d1c18a-de12-44fe-a888-de93f3b574b3]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 473388, 'reachable_time': 33426, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 209737, 'error': None, 'target': 'ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:30:36 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:30:36.136 105036 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 21 18:30:36 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:30:36.136 105036 DEBUG oslo.privsep.daemon [-] privsep: reply[a7081fc2-5204-4976-8e64-fc4d0a3ffc48]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:30:36 compute-0 systemd[1]: run-netns-ovnmeta\x2d405ec01b\x2d76d3\x2d4c3c\x2da31b\x2d5f16d9641fbf.mount: Deactivated successfully.
Jan 21 18:30:37 compute-0 nova_compute[183278]: 2026-01-21 18:30:37.264 183284 DEBUG nova.compute.manager [req-d7f0f838-cf79-428e-9ea3-d0a16087ccba req-165a2ae9-790f-4ba8-8d82-6e08e89d2bf9 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Received event network-vif-plugged-5acb086b-a54d-4c3a-a7af-bea2feeff1b4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:30:37 compute-0 nova_compute[183278]: 2026-01-21 18:30:37.265 183284 DEBUG oslo_concurrency.lockutils [req-d7f0f838-cf79-428e-9ea3-d0a16087ccba req-165a2ae9-790f-4ba8-8d82-6e08e89d2bf9 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "0df2cec9-7a16-40a4-96bf-c39f8782df91-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:30:37 compute-0 nova_compute[183278]: 2026-01-21 18:30:37.265 183284 DEBUG oslo_concurrency.lockutils [req-d7f0f838-cf79-428e-9ea3-d0a16087ccba req-165a2ae9-790f-4ba8-8d82-6e08e89d2bf9 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "0df2cec9-7a16-40a4-96bf-c39f8782df91-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:30:37 compute-0 nova_compute[183278]: 2026-01-21 18:30:37.265 183284 DEBUG oslo_concurrency.lockutils [req-d7f0f838-cf79-428e-9ea3-d0a16087ccba req-165a2ae9-790f-4ba8-8d82-6e08e89d2bf9 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "0df2cec9-7a16-40a4-96bf-c39f8782df91-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:30:37 compute-0 nova_compute[183278]: 2026-01-21 18:30:37.265 183284 DEBUG nova.compute.manager [req-d7f0f838-cf79-428e-9ea3-d0a16087ccba req-165a2ae9-790f-4ba8-8d82-6e08e89d2bf9 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] No waiting events found dispatching network-vif-plugged-5acb086b-a54d-4c3a-a7af-bea2feeff1b4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:30:37 compute-0 nova_compute[183278]: 2026-01-21 18:30:37.265 183284 WARNING nova.compute.manager [req-d7f0f838-cf79-428e-9ea3-d0a16087ccba req-165a2ae9-790f-4ba8-8d82-6e08e89d2bf9 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Received unexpected event network-vif-plugged-5acb086b-a54d-4c3a-a7af-bea2feeff1b4 for instance with vm_state active and task_state migrating.
Jan 21 18:30:37 compute-0 nova_compute[183278]: 2026-01-21 18:30:37.266 183284 DEBUG nova.compute.manager [req-d7f0f838-cf79-428e-9ea3-d0a16087ccba req-165a2ae9-790f-4ba8-8d82-6e08e89d2bf9 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Received event network-vif-plugged-5acb086b-a54d-4c3a-a7af-bea2feeff1b4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:30:37 compute-0 nova_compute[183278]: 2026-01-21 18:30:37.266 183284 DEBUG oslo_concurrency.lockutils [req-d7f0f838-cf79-428e-9ea3-d0a16087ccba req-165a2ae9-790f-4ba8-8d82-6e08e89d2bf9 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "0df2cec9-7a16-40a4-96bf-c39f8782df91-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:30:37 compute-0 nova_compute[183278]: 2026-01-21 18:30:37.266 183284 DEBUG oslo_concurrency.lockutils [req-d7f0f838-cf79-428e-9ea3-d0a16087ccba req-165a2ae9-790f-4ba8-8d82-6e08e89d2bf9 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "0df2cec9-7a16-40a4-96bf-c39f8782df91-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:30:37 compute-0 nova_compute[183278]: 2026-01-21 18:30:37.266 183284 DEBUG oslo_concurrency.lockutils [req-d7f0f838-cf79-428e-9ea3-d0a16087ccba req-165a2ae9-790f-4ba8-8d82-6e08e89d2bf9 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "0df2cec9-7a16-40a4-96bf-c39f8782df91-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:30:37 compute-0 nova_compute[183278]: 2026-01-21 18:30:37.266 183284 DEBUG nova.compute.manager [req-d7f0f838-cf79-428e-9ea3-d0a16087ccba req-165a2ae9-790f-4ba8-8d82-6e08e89d2bf9 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] No waiting events found dispatching network-vif-plugged-5acb086b-a54d-4c3a-a7af-bea2feeff1b4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:30:37 compute-0 nova_compute[183278]: 2026-01-21 18:30:37.267 183284 WARNING nova.compute.manager [req-d7f0f838-cf79-428e-9ea3-d0a16087ccba req-165a2ae9-790f-4ba8-8d82-6e08e89d2bf9 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Received unexpected event network-vif-plugged-5acb086b-a54d-4c3a-a7af-bea2feeff1b4 for instance with vm_state active and task_state migrating.
Jan 21 18:30:37 compute-0 nova_compute[183278]: 2026-01-21 18:30:37.267 183284 DEBUG nova.compute.manager [req-d7f0f838-cf79-428e-9ea3-d0a16087ccba req-165a2ae9-790f-4ba8-8d82-6e08e89d2bf9 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Received event network-vif-unplugged-5acb086b-a54d-4c3a-a7af-bea2feeff1b4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:30:37 compute-0 nova_compute[183278]: 2026-01-21 18:30:37.267 183284 DEBUG oslo_concurrency.lockutils [req-d7f0f838-cf79-428e-9ea3-d0a16087ccba req-165a2ae9-790f-4ba8-8d82-6e08e89d2bf9 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "0df2cec9-7a16-40a4-96bf-c39f8782df91-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:30:37 compute-0 nova_compute[183278]: 2026-01-21 18:30:37.267 183284 DEBUG oslo_concurrency.lockutils [req-d7f0f838-cf79-428e-9ea3-d0a16087ccba req-165a2ae9-790f-4ba8-8d82-6e08e89d2bf9 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "0df2cec9-7a16-40a4-96bf-c39f8782df91-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:30:37 compute-0 nova_compute[183278]: 2026-01-21 18:30:37.267 183284 DEBUG oslo_concurrency.lockutils [req-d7f0f838-cf79-428e-9ea3-d0a16087ccba req-165a2ae9-790f-4ba8-8d82-6e08e89d2bf9 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "0df2cec9-7a16-40a4-96bf-c39f8782df91-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:30:37 compute-0 nova_compute[183278]: 2026-01-21 18:30:37.267 183284 DEBUG nova.compute.manager [req-d7f0f838-cf79-428e-9ea3-d0a16087ccba req-165a2ae9-790f-4ba8-8d82-6e08e89d2bf9 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] No waiting events found dispatching network-vif-unplugged-5acb086b-a54d-4c3a-a7af-bea2feeff1b4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:30:37 compute-0 nova_compute[183278]: 2026-01-21 18:30:37.268 183284 DEBUG nova.compute.manager [req-d7f0f838-cf79-428e-9ea3-d0a16087ccba req-165a2ae9-790f-4ba8-8d82-6e08e89d2bf9 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Received event network-vif-unplugged-5acb086b-a54d-4c3a-a7af-bea2feeff1b4 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 21 18:30:37 compute-0 nova_compute[183278]: 2026-01-21 18:30:37.268 183284 DEBUG nova.compute.manager [req-d7f0f838-cf79-428e-9ea3-d0a16087ccba req-165a2ae9-790f-4ba8-8d82-6e08e89d2bf9 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Received event network-vif-plugged-5acb086b-a54d-4c3a-a7af-bea2feeff1b4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:30:37 compute-0 nova_compute[183278]: 2026-01-21 18:30:37.268 183284 DEBUG oslo_concurrency.lockutils [req-d7f0f838-cf79-428e-9ea3-d0a16087ccba req-165a2ae9-790f-4ba8-8d82-6e08e89d2bf9 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "0df2cec9-7a16-40a4-96bf-c39f8782df91-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:30:37 compute-0 nova_compute[183278]: 2026-01-21 18:30:37.268 183284 DEBUG oslo_concurrency.lockutils [req-d7f0f838-cf79-428e-9ea3-d0a16087ccba req-165a2ae9-790f-4ba8-8d82-6e08e89d2bf9 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "0df2cec9-7a16-40a4-96bf-c39f8782df91-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:30:37 compute-0 nova_compute[183278]: 2026-01-21 18:30:37.268 183284 DEBUG oslo_concurrency.lockutils [req-d7f0f838-cf79-428e-9ea3-d0a16087ccba req-165a2ae9-790f-4ba8-8d82-6e08e89d2bf9 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "0df2cec9-7a16-40a4-96bf-c39f8782df91-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:30:37 compute-0 nova_compute[183278]: 2026-01-21 18:30:37.269 183284 DEBUG nova.compute.manager [req-d7f0f838-cf79-428e-9ea3-d0a16087ccba req-165a2ae9-790f-4ba8-8d82-6e08e89d2bf9 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] No waiting events found dispatching network-vif-plugged-5acb086b-a54d-4c3a-a7af-bea2feeff1b4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:30:37 compute-0 nova_compute[183278]: 2026-01-21 18:30:37.269 183284 WARNING nova.compute.manager [req-d7f0f838-cf79-428e-9ea3-d0a16087ccba req-165a2ae9-790f-4ba8-8d82-6e08e89d2bf9 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Received unexpected event network-vif-plugged-5acb086b-a54d-4c3a-a7af-bea2feeff1b4 for instance with vm_state active and task_state migrating.
Jan 21 18:30:37 compute-0 nova_compute[183278]: 2026-01-21 18:30:37.269 183284 DEBUG nova.compute.manager [req-d7f0f838-cf79-428e-9ea3-d0a16087ccba req-165a2ae9-790f-4ba8-8d82-6e08e89d2bf9 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Received event network-vif-plugged-5acb086b-a54d-4c3a-a7af-bea2feeff1b4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:30:37 compute-0 nova_compute[183278]: 2026-01-21 18:30:37.269 183284 DEBUG oslo_concurrency.lockutils [req-d7f0f838-cf79-428e-9ea3-d0a16087ccba req-165a2ae9-790f-4ba8-8d82-6e08e89d2bf9 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "0df2cec9-7a16-40a4-96bf-c39f8782df91-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:30:37 compute-0 nova_compute[183278]: 2026-01-21 18:30:37.269 183284 DEBUG oslo_concurrency.lockutils [req-d7f0f838-cf79-428e-9ea3-d0a16087ccba req-165a2ae9-790f-4ba8-8d82-6e08e89d2bf9 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "0df2cec9-7a16-40a4-96bf-c39f8782df91-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:30:37 compute-0 nova_compute[183278]: 2026-01-21 18:30:37.270 183284 DEBUG oslo_concurrency.lockutils [req-d7f0f838-cf79-428e-9ea3-d0a16087ccba req-165a2ae9-790f-4ba8-8d82-6e08e89d2bf9 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "0df2cec9-7a16-40a4-96bf-c39f8782df91-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:30:37 compute-0 nova_compute[183278]: 2026-01-21 18:30:37.270 183284 DEBUG nova.compute.manager [req-d7f0f838-cf79-428e-9ea3-d0a16087ccba req-165a2ae9-790f-4ba8-8d82-6e08e89d2bf9 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] No waiting events found dispatching network-vif-plugged-5acb086b-a54d-4c3a-a7af-bea2feeff1b4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:30:37 compute-0 nova_compute[183278]: 2026-01-21 18:30:37.270 183284 WARNING nova.compute.manager [req-d7f0f838-cf79-428e-9ea3-d0a16087ccba req-165a2ae9-790f-4ba8-8d82-6e08e89d2bf9 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Received unexpected event network-vif-plugged-5acb086b-a54d-4c3a-a7af-bea2feeff1b4 for instance with vm_state active and task_state migrating.
Jan 21 18:30:38 compute-0 sshd-session[209738]: Invalid user ubuntu from 39.191.29.114 port 48816
Jan 21 18:30:38 compute-0 sshd-session[209738]: Received disconnect from 39.191.29.114 port 48816:11:  [preauth]
Jan 21 18:30:38 compute-0 sshd-session[209738]: Disconnected from invalid user ubuntu 39.191.29.114 port 48816 [preauth]
Jan 21 18:30:39 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Jan 21 18:30:39 compute-0 systemd[209597]: Activating special unit Exit the Session...
Jan 21 18:30:39 compute-0 systemd[209597]: Stopped target Main User Target.
Jan 21 18:30:39 compute-0 systemd[209597]: Stopped target Basic System.
Jan 21 18:30:39 compute-0 systemd[209597]: Stopped target Paths.
Jan 21 18:30:39 compute-0 systemd[209597]: Stopped target Sockets.
Jan 21 18:30:39 compute-0 systemd[209597]: Stopped target Timers.
Jan 21 18:30:39 compute-0 systemd[209597]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 21 18:30:39 compute-0 systemd[209597]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 21 18:30:39 compute-0 systemd[209597]: Closed D-Bus User Message Bus Socket.
Jan 21 18:30:39 compute-0 systemd[209597]: Stopped Create User's Volatile Files and Directories.
Jan 21 18:30:39 compute-0 systemd[209597]: Removed slice User Application Slice.
Jan 21 18:30:39 compute-0 systemd[209597]: Reached target Shutdown.
Jan 21 18:30:39 compute-0 systemd[209597]: Finished Exit the Session.
Jan 21 18:30:39 compute-0 systemd[209597]: Reached target Exit the Session.
Jan 21 18:30:39 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Jan 21 18:30:39 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Jan 21 18:30:39 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 21 18:30:39 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 21 18:30:39 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 21 18:30:39 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 21 18:30:39 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Jan 21 18:30:40 compute-0 podman[209741]: 2026-01-21 18:30:40.004634483 +0000 UTC m=+0.052145952 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_metadata_agent)
Jan 21 18:30:40 compute-0 podman[209740]: 2026-01-21 18:30:40.048677248 +0000 UTC m=+0.099134338 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 21 18:30:40 compute-0 nova_compute[183278]: 2026-01-21 18:30:40.742 183284 DEBUG oslo_concurrency.lockutils [None req-f74d8c4b-77e6-477a-a450-06a2ad13a723 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "0df2cec9-7a16-40a4-96bf-c39f8782df91-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:30:40 compute-0 nova_compute[183278]: 2026-01-21 18:30:40.743 183284 DEBUG oslo_concurrency.lockutils [None req-f74d8c4b-77e6-477a-a450-06a2ad13a723 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lock "0df2cec9-7a16-40a4-96bf-c39f8782df91-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:30:40 compute-0 nova_compute[183278]: 2026-01-21 18:30:40.743 183284 DEBUG oslo_concurrency.lockutils [None req-f74d8c4b-77e6-477a-a450-06a2ad13a723 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lock "0df2cec9-7a16-40a4-96bf-c39f8782df91-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:30:40 compute-0 nova_compute[183278]: 2026-01-21 18:30:40.763 183284 DEBUG oslo_concurrency.lockutils [None req-f74d8c4b-77e6-477a-a450-06a2ad13a723 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:30:40 compute-0 nova_compute[183278]: 2026-01-21 18:30:40.764 183284 DEBUG oslo_concurrency.lockutils [None req-f74d8c4b-77e6-477a-a450-06a2ad13a723 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:30:40 compute-0 nova_compute[183278]: 2026-01-21 18:30:40.764 183284 DEBUG oslo_concurrency.lockutils [None req-f74d8c4b-77e6-477a-a450-06a2ad13a723 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:30:40 compute-0 nova_compute[183278]: 2026-01-21 18:30:40.764 183284 DEBUG nova.compute.resource_tracker [None req-f74d8c4b-77e6-477a-a450-06a2ad13a723 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 18:30:40 compute-0 nova_compute[183278]: 2026-01-21 18:30:40.861 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:30:40 compute-0 nova_compute[183278]: 2026-01-21 18:30:40.911 183284 WARNING nova.virt.libvirt.driver [None req-f74d8c4b-77e6-477a-a450-06a2ad13a723 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 18:30:40 compute-0 nova_compute[183278]: 2026-01-21 18:30:40.912 183284 DEBUG nova.compute.resource_tracker [None req-f74d8c4b-77e6-477a-a450-06a2ad13a723 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5816MB free_disk=73.3809585571289GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 18:30:40 compute-0 nova_compute[183278]: 2026-01-21 18:30:40.912 183284 DEBUG oslo_concurrency.lockutils [None req-f74d8c4b-77e6-477a-a450-06a2ad13a723 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:30:40 compute-0 nova_compute[183278]: 2026-01-21 18:30:40.912 183284 DEBUG oslo_concurrency.lockutils [None req-f74d8c4b-77e6-477a-a450-06a2ad13a723 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:30:40 compute-0 nova_compute[183278]: 2026-01-21 18:30:40.956 183284 DEBUG nova.compute.resource_tracker [None req-f74d8c4b-77e6-477a-a450-06a2ad13a723 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Migration for instance 0df2cec9-7a16-40a4-96bf-c39f8782df91 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Jan 21 18:30:40 compute-0 nova_compute[183278]: 2026-01-21 18:30:40.974 183284 DEBUG nova.compute.resource_tracker [None req-f74d8c4b-77e6-477a-a450-06a2ad13a723 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Jan 21 18:30:41 compute-0 nova_compute[183278]: 2026-01-21 18:30:41.002 183284 DEBUG nova.compute.resource_tracker [None req-f74d8c4b-77e6-477a-a450-06a2ad13a723 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Migration fe5ca158-35cc-4c7e-ab91-8b4e19a8a4f8 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Jan 21 18:30:41 compute-0 nova_compute[183278]: 2026-01-21 18:30:41.003 183284 DEBUG nova.compute.resource_tracker [None req-f74d8c4b-77e6-477a-a450-06a2ad13a723 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 18:30:41 compute-0 nova_compute[183278]: 2026-01-21 18:30:41.003 183284 DEBUG nova.compute.resource_tracker [None req-f74d8c4b-77e6-477a-a450-06a2ad13a723 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 18:30:41 compute-0 nova_compute[183278]: 2026-01-21 18:30:41.040 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:30:41 compute-0 nova_compute[183278]: 2026-01-21 18:30:41.045 183284 DEBUG nova.compute.provider_tree [None req-f74d8c4b-77e6-477a-a450-06a2ad13a723 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Inventory has not changed in ProviderTree for provider: 502e4243-611b-433d-a766-9b485d51652d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 18:30:41 compute-0 nova_compute[183278]: 2026-01-21 18:30:41.061 183284 DEBUG nova.scheduler.client.report [None req-f74d8c4b-77e6-477a-a450-06a2ad13a723 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Inventory has not changed for provider 502e4243-611b-433d-a766-9b485d51652d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 18:30:41 compute-0 nova_compute[183278]: 2026-01-21 18:30:41.084 183284 DEBUG nova.compute.resource_tracker [None req-f74d8c4b-77e6-477a-a450-06a2ad13a723 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 18:30:41 compute-0 nova_compute[183278]: 2026-01-21 18:30:41.085 183284 DEBUG oslo_concurrency.lockutils [None req-f74d8c4b-77e6-477a-a450-06a2ad13a723 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.172s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:30:41 compute-0 nova_compute[183278]: 2026-01-21 18:30:41.090 183284 INFO nova.compute.manager [None req-f74d8c4b-77e6-477a-a450-06a2ad13a723 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Jan 21 18:30:41 compute-0 nova_compute[183278]: 2026-01-21 18:30:41.165 183284 INFO nova.scheduler.client.report [None req-f74d8c4b-77e6-477a-a450-06a2ad13a723 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Deleted allocation for migration fe5ca158-35cc-4c7e-ab91-8b4e19a8a4f8
Jan 21 18:30:41 compute-0 nova_compute[183278]: 2026-01-21 18:30:41.166 183284 DEBUG nova.virt.libvirt.driver [None req-f74d8c4b-77e6-477a-a450-06a2ad13a723 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Jan 21 18:30:42 compute-0 nova_compute[183278]: 2026-01-21 18:30:42.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:30:45 compute-0 podman[209785]: 2026-01-21 18:30:45.006560109 +0000 UTC m=+0.058173308 container health_status 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 18:30:45 compute-0 nova_compute[183278]: 2026-01-21 18:30:45.863 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:30:46 compute-0 nova_compute[183278]: 2026-01-21 18:30:46.041 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:30:47 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:30:47.543 104698 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=18, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e2:aa:66', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f6:39:87:73:c8'}, ipsec=False) old=SB_Global(nb_cfg=17) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 18:30:47 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:30:47.544 104698 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 21 18:30:47 compute-0 nova_compute[183278]: 2026-01-21 18:30:47.571 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:30:50 compute-0 nova_compute[183278]: 2026-01-21 18:30:50.077 183284 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769020235.0765605, 0df2cec9-7a16-40a4-96bf-c39f8782df91 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:30:50 compute-0 nova_compute[183278]: 2026-01-21 18:30:50.078 183284 INFO nova.compute.manager [-] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] VM Stopped (Lifecycle Event)
Jan 21 18:30:50 compute-0 nova_compute[183278]: 2026-01-21 18:30:50.159 183284 DEBUG nova.compute.manager [None req-bb0a979f-3993-4181-a555-a8144820aaf7 - - - - - -] [instance: 0df2cec9-7a16-40a4-96bf-c39f8782df91] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:30:50 compute-0 nova_compute[183278]: 2026-01-21 18:30:50.865 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:30:51 compute-0 nova_compute[183278]: 2026-01-21 18:30:51.042 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:30:55 compute-0 nova_compute[183278]: 2026-01-21 18:30:55.867 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:30:56 compute-0 nova_compute[183278]: 2026-01-21 18:30:56.045 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:30:56 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:30:56.546 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a4db021d-a451-4e5f-8011-49af760bda68, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '18'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:30:59 compute-0 podman[192560]: time="2026-01-21T18:30:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 18:30:59 compute-0 podman[192560]: @ - - [21/Jan/2026:18:30:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15354 "" "Go-http-client/1.1"
Jan 21 18:30:59 compute-0 podman[192560]: @ - - [21/Jan/2026:18:30:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2175 "" "Go-http-client/1.1"
Jan 21 18:31:00 compute-0 nova_compute[183278]: 2026-01-21 18:31:00.869 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:31:01 compute-0 nova_compute[183278]: 2026-01-21 18:31:01.046 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:31:01 compute-0 openstack_network_exporter[195402]: ERROR   18:31:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 21 18:31:01 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:31:01 compute-0 openstack_network_exporter[195402]: ERROR   18:31:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 21 18:31:01 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:31:05 compute-0 nova_compute[183278]: 2026-01-21 18:31:05.871 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:31:06 compute-0 podman[209810]: 2026-01-21 18:31:06.017528948 +0000 UTC m=+0.060684137 container health_status 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, config_id=openstack_network_exporter, io.openshift.expose-services=, release=1755695350, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., version=9.6, maintainer=Red Hat, Inc., architecture=x86_64)
Jan 21 18:31:06 compute-0 nova_compute[183278]: 2026-01-21 18:31:06.048 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:31:10 compute-0 nova_compute[183278]: 2026-01-21 18:31:10.874 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:31:11 compute-0 podman[209832]: 2026-01-21 18:31:11.014655478 +0000 UTC m=+0.063823053 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 21 18:31:11 compute-0 podman[209831]: 2026-01-21 18:31:11.030172523 +0000 UTC m=+0.085237592 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 
Base Image, org.label-schema.schema-version=1.0)
Jan 21 18:31:11 compute-0 nova_compute[183278]: 2026-01-21 18:31:11.049 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:31:15 compute-0 nova_compute[183278]: 2026-01-21 18:31:15.876 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:31:15 compute-0 podman[209874]: 2026-01-21 18:31:15.99280022 +0000 UTC m=+0.045566773 container health_status 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 18:31:16 compute-0 nova_compute[183278]: 2026-01-21 18:31:16.050 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:31:19 compute-0 nova_compute[183278]: 2026-01-21 18:31:19.828 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:31:19 compute-0 nova_compute[183278]: 2026-01-21 18:31:19.828 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 18:31:19 compute-0 nova_compute[183278]: 2026-01-21 18:31:19.828 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 18:31:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:31:20.090 104698 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:31:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:31:20.091 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:31:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:31:20.091 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:31:20 compute-0 nova_compute[183278]: 2026-01-21 18:31:20.109 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 21 18:31:20 compute-0 ovn_controller[95419]: 2026-01-21T18:31:20Z|00136|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Jan 21 18:31:20 compute-0 sshd-session[209898]: Connection closed by authenticating user root 64.227.98.100 port 39240 [preauth]
Jan 21 18:31:20 compute-0 nova_compute[183278]: 2026-01-21 18:31:20.878 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:31:21 compute-0 nova_compute[183278]: 2026-01-21 18:31:21.052 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:31:21 compute-0 nova_compute[183278]: 2026-01-21 18:31:21.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:31:21 compute-0 nova_compute[183278]: 2026-01-21 18:31:21.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:31:21 compute-0 nova_compute[183278]: 2026-01-21 18:31:21.848 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:31:21 compute-0 nova_compute[183278]: 2026-01-21 18:31:21.848 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:31:21 compute-0 nova_compute[183278]: 2026-01-21 18:31:21.848 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:31:21 compute-0 nova_compute[183278]: 2026-01-21 18:31:21.849 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 18:31:21 compute-0 nova_compute[183278]: 2026-01-21 18:31:21.992 183284 WARNING nova.virt.libvirt.driver [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 18:31:21 compute-0 nova_compute[183278]: 2026-01-21 18:31:21.993 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5837MB free_disk=73.38093948364258GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 18:31:21 compute-0 nova_compute[183278]: 2026-01-21 18:31:21.994 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:31:21 compute-0 nova_compute[183278]: 2026-01-21 18:31:21.994 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:31:22 compute-0 nova_compute[183278]: 2026-01-21 18:31:22.093 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 18:31:22 compute-0 nova_compute[183278]: 2026-01-21 18:31:22.093 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 18:31:22 compute-0 nova_compute[183278]: 2026-01-21 18:31:22.113 183284 DEBUG nova.compute.provider_tree [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Inventory has not changed in ProviderTree for provider: 502e4243-611b-433d-a766-9b485d51652d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 18:31:22 compute-0 nova_compute[183278]: 2026-01-21 18:31:22.126 183284 DEBUG nova.scheduler.client.report [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Inventory has not changed for provider 502e4243-611b-433d-a766-9b485d51652d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 18:31:22 compute-0 nova_compute[183278]: 2026-01-21 18:31:22.127 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 18:31:22 compute-0 nova_compute[183278]: 2026-01-21 18:31:22.128 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.134s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:31:25 compute-0 nova_compute[183278]: 2026-01-21 18:31:25.124 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:31:25 compute-0 nova_compute[183278]: 2026-01-21 18:31:25.125 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:31:25 compute-0 nova_compute[183278]: 2026-01-21 18:31:25.879 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:31:26 compute-0 nova_compute[183278]: 2026-01-21 18:31:26.053 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:31:26 compute-0 nova_compute[183278]: 2026-01-21 18:31:26.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:31:26 compute-0 nova_compute[183278]: 2026-01-21 18:31:26.841 183284 DEBUG oslo_concurrency.lockutils [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Acquiring lock "c0c2fb2d-4e32-4586-8725-c987369255b6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:31:26 compute-0 nova_compute[183278]: 2026-01-21 18:31:26.841 183284 DEBUG oslo_concurrency.lockutils [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "c0c2fb2d-4e32-4586-8725-c987369255b6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:31:26 compute-0 nova_compute[183278]: 2026-01-21 18:31:26.859 183284 DEBUG nova.compute.manager [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: c0c2fb2d-4e32-4586-8725-c987369255b6] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 21 18:31:26 compute-0 nova_compute[183278]: 2026-01-21 18:31:26.933 183284 DEBUG oslo_concurrency.lockutils [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:31:26 compute-0 nova_compute[183278]: 2026-01-21 18:31:26.933 183284 DEBUG oslo_concurrency.lockutils [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:31:26 compute-0 nova_compute[183278]: 2026-01-21 18:31:26.939 183284 DEBUG nova.virt.hardware [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 18:31:26 compute-0 nova_compute[183278]: 2026-01-21 18:31:26.940 183284 INFO nova.compute.claims [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: c0c2fb2d-4e32-4586-8725-c987369255b6] Claim successful on node compute-0.ctlplane.example.com
Jan 21 18:31:27 compute-0 nova_compute[183278]: 2026-01-21 18:31:27.759 183284 DEBUG nova.compute.provider_tree [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Inventory has not changed in ProviderTree for provider: 502e4243-611b-433d-a766-9b485d51652d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 18:31:27 compute-0 nova_compute[183278]: 2026-01-21 18:31:27.774 183284 DEBUG nova.scheduler.client.report [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Inventory has not changed for provider 502e4243-611b-433d-a766-9b485d51652d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 18:31:27 compute-0 nova_compute[183278]: 2026-01-21 18:31:27.797 183284 DEBUG oslo_concurrency.lockutils [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.864s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:31:27 compute-0 nova_compute[183278]: 2026-01-21 18:31:27.798 183284 DEBUG nova.compute.manager [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: c0c2fb2d-4e32-4586-8725-c987369255b6] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 21 18:31:27 compute-0 nova_compute[183278]: 2026-01-21 18:31:27.814 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:31:27 compute-0 nova_compute[183278]: 2026-01-21 18:31:27.838 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:31:27 compute-0 nova_compute[183278]: 2026-01-21 18:31:27.843 183284 DEBUG nova.compute.manager [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: c0c2fb2d-4e32-4586-8725-c987369255b6] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 21 18:31:27 compute-0 nova_compute[183278]: 2026-01-21 18:31:27.843 183284 DEBUG nova.network.neutron [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: c0c2fb2d-4e32-4586-8725-c987369255b6] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 21 18:31:27 compute-0 nova_compute[183278]: 2026-01-21 18:31:27.877 183284 INFO nova.virt.libvirt.driver [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: c0c2fb2d-4e32-4586-8725-c987369255b6] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 21 18:31:27 compute-0 nova_compute[183278]: 2026-01-21 18:31:27.904 183284 DEBUG nova.compute.manager [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: c0c2fb2d-4e32-4586-8725-c987369255b6] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 21 18:31:28 compute-0 nova_compute[183278]: 2026-01-21 18:31:28.015 183284 DEBUG nova.compute.manager [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: c0c2fb2d-4e32-4586-8725-c987369255b6] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 21 18:31:28 compute-0 nova_compute[183278]: 2026-01-21 18:31:28.016 183284 DEBUG nova.virt.libvirt.driver [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: c0c2fb2d-4e32-4586-8725-c987369255b6] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 18:31:28 compute-0 nova_compute[183278]: 2026-01-21 18:31:28.017 183284 INFO nova.virt.libvirt.driver [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: c0c2fb2d-4e32-4586-8725-c987369255b6] Creating image(s)
Jan 21 18:31:28 compute-0 nova_compute[183278]: 2026-01-21 18:31:28.017 183284 DEBUG oslo_concurrency.lockutils [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Acquiring lock "/var/lib/nova/instances/c0c2fb2d-4e32-4586-8725-c987369255b6/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:31:28 compute-0 nova_compute[183278]: 2026-01-21 18:31:28.017 183284 DEBUG oslo_concurrency.lockutils [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "/var/lib/nova/instances/c0c2fb2d-4e32-4586-8725-c987369255b6/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:31:28 compute-0 nova_compute[183278]: 2026-01-21 18:31:28.018 183284 DEBUG oslo_concurrency.lockutils [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "/var/lib/nova/instances/c0c2fb2d-4e32-4586-8725-c987369255b6/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:31:28 compute-0 nova_compute[183278]: 2026-01-21 18:31:28.031 183284 DEBUG oslo_concurrency.processutils [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:31:28 compute-0 nova_compute[183278]: 2026-01-21 18:31:28.088 183284 DEBUG oslo_concurrency.processutils [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:31:28 compute-0 nova_compute[183278]: 2026-01-21 18:31:28.089 183284 DEBUG oslo_concurrency.lockutils [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Acquiring lock "8bf45782a806e8f2f684ae874be9ab99d891a685" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:31:28 compute-0 nova_compute[183278]: 2026-01-21 18:31:28.089 183284 DEBUG oslo_concurrency.lockutils [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "8bf45782a806e8f2f684ae874be9ab99d891a685" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:31:28 compute-0 nova_compute[183278]: 2026-01-21 18:31:28.100 183284 DEBUG oslo_concurrency.processutils [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:31:28 compute-0 nova_compute[183278]: 2026-01-21 18:31:28.156 183284 DEBUG oslo_concurrency.processutils [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:31:28 compute-0 nova_compute[183278]: 2026-01-21 18:31:28.157 183284 DEBUG oslo_concurrency.processutils [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685,backing_fmt=raw /var/lib/nova/instances/c0c2fb2d-4e32-4586-8725-c987369255b6/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:31:28 compute-0 nova_compute[183278]: 2026-01-21 18:31:28.191 183284 DEBUG oslo_concurrency.processutils [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685,backing_fmt=raw /var/lib/nova/instances/c0c2fb2d-4e32-4586-8725-c987369255b6/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:31:28 compute-0 nova_compute[183278]: 2026-01-21 18:31:28.192 183284 DEBUG oslo_concurrency.lockutils [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "8bf45782a806e8f2f684ae874be9ab99d891a685" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.103s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:31:28 compute-0 nova_compute[183278]: 2026-01-21 18:31:28.193 183284 DEBUG oslo_concurrency.processutils [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:31:28 compute-0 nova_compute[183278]: 2026-01-21 18:31:28.243 183284 DEBUG oslo_concurrency.processutils [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:31:28 compute-0 nova_compute[183278]: 2026-01-21 18:31:28.244 183284 DEBUG nova.virt.disk.api [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Checking if we can resize image /var/lib/nova/instances/c0c2fb2d-4e32-4586-8725-c987369255b6/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 18:31:28 compute-0 nova_compute[183278]: 2026-01-21 18:31:28.245 183284 DEBUG oslo_concurrency.processutils [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c0c2fb2d-4e32-4586-8725-c987369255b6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:31:28 compute-0 nova_compute[183278]: 2026-01-21 18:31:28.305 183284 DEBUG oslo_concurrency.processutils [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c0c2fb2d-4e32-4586-8725-c987369255b6/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:31:28 compute-0 nova_compute[183278]: 2026-01-21 18:31:28.306 183284 DEBUG nova.virt.disk.api [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Cannot resize image /var/lib/nova/instances/c0c2fb2d-4e32-4586-8725-c987369255b6/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 18:31:28 compute-0 nova_compute[183278]: 2026-01-21 18:31:28.306 183284 DEBUG nova.objects.instance [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lazy-loading 'migration_context' on Instance uuid c0c2fb2d-4e32-4586-8725-c987369255b6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:31:28 compute-0 nova_compute[183278]: 2026-01-21 18:31:28.320 183284 DEBUG nova.virt.libvirt.driver [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: c0c2fb2d-4e32-4586-8725-c987369255b6] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 18:31:28 compute-0 nova_compute[183278]: 2026-01-21 18:31:28.321 183284 DEBUG nova.virt.libvirt.driver [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: c0c2fb2d-4e32-4586-8725-c987369255b6] Ensure instance console log exists: /var/lib/nova/instances/c0c2fb2d-4e32-4586-8725-c987369255b6/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 18:31:28 compute-0 nova_compute[183278]: 2026-01-21 18:31:28.321 183284 DEBUG oslo_concurrency.lockutils [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:31:28 compute-0 nova_compute[183278]: 2026-01-21 18:31:28.321 183284 DEBUG oslo_concurrency.lockutils [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:31:28 compute-0 nova_compute[183278]: 2026-01-21 18:31:28.322 183284 DEBUG oslo_concurrency.lockutils [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:31:28 compute-0 nova_compute[183278]: 2026-01-21 18:31:28.589 183284 DEBUG nova.policy [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '41dc6e790bc54fbfaf5c6007d3fa5f63', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'fe688847145f4dee992c72dd40bbc1ac', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 21 18:31:29 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:31:29.645 104698 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=19, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e2:aa:66', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f6:39:87:73:c8'}, ipsec=False) old=SB_Global(nb_cfg=18) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 18:31:29 compute-0 nova_compute[183278]: 2026-01-21 18:31:29.646 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:31:29 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:31:29.646 104698 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 21 18:31:29 compute-0 podman[192560]: time="2026-01-21T18:31:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 18:31:29 compute-0 podman[192560]: @ - - [21/Jan/2026:18:31:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15354 "" "Go-http-client/1.1"
Jan 21 18:31:29 compute-0 podman[192560]: @ - - [21/Jan/2026:18:31:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2177 "" "Go-http-client/1.1"
Jan 21 18:31:30 compute-0 nova_compute[183278]: 2026-01-21 18:31:30.573 183284 DEBUG nova.network.neutron [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: c0c2fb2d-4e32-4586-8725-c987369255b6] Successfully created port: 23b1c1c6-59de-47af-98b4-6d28c3f94ff3 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 21 18:31:30 compute-0 nova_compute[183278]: 2026-01-21 18:31:30.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:31:30 compute-0 nova_compute[183278]: 2026-01-21 18:31:30.879 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:31:31 compute-0 nova_compute[183278]: 2026-01-21 18:31:31.055 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:31:31 compute-0 openstack_network_exporter[195402]: ERROR   18:31:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 21 18:31:31 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:31:31 compute-0 openstack_network_exporter[195402]: ERROR   18:31:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 21 18:31:31 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:31:31 compute-0 nova_compute[183278]: 2026-01-21 18:31:31.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:31:31 compute-0 nova_compute[183278]: 2026-01-21 18:31:31.817 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 18:31:33 compute-0 nova_compute[183278]: 2026-01-21 18:31:33.158 183284 DEBUG nova.network.neutron [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: c0c2fb2d-4e32-4586-8725-c987369255b6] Successfully updated port: 23b1c1c6-59de-47af-98b4-6d28c3f94ff3 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 21 18:31:33 compute-0 nova_compute[183278]: 2026-01-21 18:31:33.245 183284 DEBUG oslo_concurrency.lockutils [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Acquiring lock "refresh_cache-c0c2fb2d-4e32-4586-8725-c987369255b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:31:33 compute-0 nova_compute[183278]: 2026-01-21 18:31:33.246 183284 DEBUG oslo_concurrency.lockutils [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Acquired lock "refresh_cache-c0c2fb2d-4e32-4586-8725-c987369255b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 18:31:33 compute-0 nova_compute[183278]: 2026-01-21 18:31:33.246 183284 DEBUG nova.network.neutron [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: c0c2fb2d-4e32-4586-8725-c987369255b6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 18:31:33 compute-0 nova_compute[183278]: 2026-01-21 18:31:33.339 183284 DEBUG nova.compute.manager [req-2662d0a1-ba2e-44a7-a07a-e0dd74de47d8 req-d37733f8-bb8c-4dc2-914b-76fe102ad477 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c0c2fb2d-4e32-4586-8725-c987369255b6] Received event network-changed-23b1c1c6-59de-47af-98b4-6d28c3f94ff3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:31:33 compute-0 nova_compute[183278]: 2026-01-21 18:31:33.340 183284 DEBUG nova.compute.manager [req-2662d0a1-ba2e-44a7-a07a-e0dd74de47d8 req-d37733f8-bb8c-4dc2-914b-76fe102ad477 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c0c2fb2d-4e32-4586-8725-c987369255b6] Refreshing instance network info cache due to event network-changed-23b1c1c6-59de-47af-98b4-6d28c3f94ff3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 18:31:33 compute-0 nova_compute[183278]: 2026-01-21 18:31:33.340 183284 DEBUG oslo_concurrency.lockutils [req-2662d0a1-ba2e-44a7-a07a-e0dd74de47d8 req-d37733f8-bb8c-4dc2-914b-76fe102ad477 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "refresh_cache-c0c2fb2d-4e32-4586-8725-c987369255b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:31:33 compute-0 nova_compute[183278]: 2026-01-21 18:31:33.488 183284 DEBUG nova.network.neutron [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: c0c2fb2d-4e32-4586-8725-c987369255b6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 18:31:34 compute-0 nova_compute[183278]: 2026-01-21 18:31:34.074 183284 DEBUG nova.network.neutron [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: c0c2fb2d-4e32-4586-8725-c987369255b6] Updating instance_info_cache with network_info: [{"id": "23b1c1c6-59de-47af-98b4-6d28c3f94ff3", "address": "fa:16:3e:1e:2a:51", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23b1c1c6-59", "ovs_interfaceid": "23b1c1c6-59de-47af-98b4-6d28c3f94ff3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:31:34 compute-0 nova_compute[183278]: 2026-01-21 18:31:34.205 183284 DEBUG oslo_concurrency.lockutils [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Releasing lock "refresh_cache-c0c2fb2d-4e32-4586-8725-c987369255b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 18:31:34 compute-0 nova_compute[183278]: 2026-01-21 18:31:34.205 183284 DEBUG nova.compute.manager [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: c0c2fb2d-4e32-4586-8725-c987369255b6] Instance network_info: |[{"id": "23b1c1c6-59de-47af-98b4-6d28c3f94ff3", "address": "fa:16:3e:1e:2a:51", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23b1c1c6-59", "ovs_interfaceid": "23b1c1c6-59de-47af-98b4-6d28c3f94ff3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 21 18:31:34 compute-0 nova_compute[183278]: 2026-01-21 18:31:34.205 183284 DEBUG oslo_concurrency.lockutils [req-2662d0a1-ba2e-44a7-a07a-e0dd74de47d8 req-d37733f8-bb8c-4dc2-914b-76fe102ad477 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquired lock "refresh_cache-c0c2fb2d-4e32-4586-8725-c987369255b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 18:31:34 compute-0 nova_compute[183278]: 2026-01-21 18:31:34.206 183284 DEBUG nova.network.neutron [req-2662d0a1-ba2e-44a7-a07a-e0dd74de47d8 req-d37733f8-bb8c-4dc2-914b-76fe102ad477 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c0c2fb2d-4e32-4586-8725-c987369255b6] Refreshing network info cache for port 23b1c1c6-59de-47af-98b4-6d28c3f94ff3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 18:31:34 compute-0 nova_compute[183278]: 2026-01-21 18:31:34.208 183284 DEBUG nova.virt.libvirt.driver [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: c0c2fb2d-4e32-4586-8725-c987369255b6] Start _get_guest_xml network_info=[{"id": "23b1c1c6-59de-47af-98b4-6d28c3f94ff3", "address": "fa:16:3e:1e:2a:51", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23b1c1c6-59", "ovs_interfaceid": "23b1c1c6-59de-47af-98b4-6d28c3f94ff3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T18:09:31Z,direct_url=<?>,disk_format='qcow2',id=672306ae-5521-4fc1-a825-a16d6d125c61,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='05bd8d3790e24414a90ecf55ee1939e1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T18:09:33Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_type': 'disk', 'encryption_options': None, 'boot_index': 0, 'encryption_secret_uuid': None, 'encryption_format': None, 'image_id': '672306ae-5521-4fc1-a825-a16d6d125c61'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 18:31:34 compute-0 nova_compute[183278]: 2026-01-21 18:31:34.212 183284 WARNING nova.virt.libvirt.driver [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 18:31:34 compute-0 nova_compute[183278]: 2026-01-21 18:31:34.217 183284 DEBUG nova.virt.libvirt.host [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 18:31:34 compute-0 nova_compute[183278]: 2026-01-21 18:31:34.218 183284 DEBUG nova.virt.libvirt.host [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 18:31:34 compute-0 nova_compute[183278]: 2026-01-21 18:31:34.221 183284 DEBUG nova.virt.libvirt.host [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 18:31:34 compute-0 nova_compute[183278]: 2026-01-21 18:31:34.222 183284 DEBUG nova.virt.libvirt.host [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 18:31:34 compute-0 nova_compute[183278]: 2026-01-21 18:31:34.223 183284 DEBUG nova.virt.libvirt.driver [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 18:31:34 compute-0 nova_compute[183278]: 2026-01-21 18:31:34.223 183284 DEBUG nova.virt.hardware [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T18:09:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='45095fe9-3fd5-4f1f-87b2-a2a8292135a2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T18:09:31Z,direct_url=<?>,disk_format='qcow2',id=672306ae-5521-4fc1-a825-a16d6d125c61,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='05bd8d3790e24414a90ecf55ee1939e1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T18:09:33Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 18:31:34 compute-0 nova_compute[183278]: 2026-01-21 18:31:34.224 183284 DEBUG nova.virt.hardware [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 18:31:34 compute-0 nova_compute[183278]: 2026-01-21 18:31:34.224 183284 DEBUG nova.virt.hardware [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 18:31:34 compute-0 nova_compute[183278]: 2026-01-21 18:31:34.224 183284 DEBUG nova.virt.hardware [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 18:31:34 compute-0 nova_compute[183278]: 2026-01-21 18:31:34.225 183284 DEBUG nova.virt.hardware [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 18:31:34 compute-0 nova_compute[183278]: 2026-01-21 18:31:34.225 183284 DEBUG nova.virt.hardware [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 18:31:34 compute-0 nova_compute[183278]: 2026-01-21 18:31:34.225 183284 DEBUG nova.virt.hardware [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 18:31:34 compute-0 nova_compute[183278]: 2026-01-21 18:31:34.225 183284 DEBUG nova.virt.hardware [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 18:31:34 compute-0 nova_compute[183278]: 2026-01-21 18:31:34.225 183284 DEBUG nova.virt.hardware [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 18:31:34 compute-0 nova_compute[183278]: 2026-01-21 18:31:34.226 183284 DEBUG nova.virt.hardware [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 18:31:34 compute-0 nova_compute[183278]: 2026-01-21 18:31:34.226 183284 DEBUG nova.virt.hardware [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 18:31:34 compute-0 nova_compute[183278]: 2026-01-21 18:31:34.229 183284 DEBUG nova.virt.libvirt.vif [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T18:31:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-746758313',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-746758313',id=19,image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fe688847145f4dee992c72dd40bbc1ac',ramdisk_id='',reservation_id='r-9qo34i17',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1753607426',owner_user_name='tempest-TestExecuteStrategies-1753607426-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T18:31:27Z,user_data=None,user_id='41dc6e790bc54fbfaf5c6007d3fa5f63',uuid=c0c2fb2d-4e32-4586-8725-c987369255b6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "23b1c1c6-59de-47af-98b4-6d28c3f94ff3", "address": "fa:16:3e:1e:2a:51", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23b1c1c6-59", "ovs_interfaceid": "23b1c1c6-59de-47af-98b4-6d28c3f94ff3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 18:31:34 compute-0 nova_compute[183278]: 2026-01-21 18:31:34.229 183284 DEBUG nova.network.os_vif_util [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Converting VIF {"id": "23b1c1c6-59de-47af-98b4-6d28c3f94ff3", "address": "fa:16:3e:1e:2a:51", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23b1c1c6-59", "ovs_interfaceid": "23b1c1c6-59de-47af-98b4-6d28c3f94ff3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 18:31:34 compute-0 nova_compute[183278]: 2026-01-21 18:31:34.230 183284 DEBUG nova.network.os_vif_util [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1e:2a:51,bridge_name='br-int',has_traffic_filtering=True,id=23b1c1c6-59de-47af-98b4-6d28c3f94ff3,network=Network(405ec01b-76d3-4c3c-a31b-5f16d9641fbf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap23b1c1c6-59') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 18:31:34 compute-0 nova_compute[183278]: 2026-01-21 18:31:34.231 183284 DEBUG nova.objects.instance [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lazy-loading 'pci_devices' on Instance uuid c0c2fb2d-4e32-4586-8725-c987369255b6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:31:34 compute-0 nova_compute[183278]: 2026-01-21 18:31:34.264 183284 DEBUG nova.virt.libvirt.driver [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: c0c2fb2d-4e32-4586-8725-c987369255b6] End _get_guest_xml xml=<domain type="kvm">
Jan 21 18:31:34 compute-0 nova_compute[183278]:   <uuid>c0c2fb2d-4e32-4586-8725-c987369255b6</uuid>
Jan 21 18:31:34 compute-0 nova_compute[183278]:   <name>instance-00000013</name>
Jan 21 18:31:34 compute-0 nova_compute[183278]:   <memory>131072</memory>
Jan 21 18:31:34 compute-0 nova_compute[183278]:   <vcpu>1</vcpu>
Jan 21 18:31:34 compute-0 nova_compute[183278]:   <metadata>
Jan 21 18:31:34 compute-0 nova_compute[183278]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 18:31:34 compute-0 nova_compute[183278]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 18:31:34 compute-0 nova_compute[183278]:       <nova:name>tempest-TestExecuteStrategies-server-746758313</nova:name>
Jan 21 18:31:34 compute-0 nova_compute[183278]:       <nova:creationTime>2026-01-21 18:31:34</nova:creationTime>
Jan 21 18:31:34 compute-0 nova_compute[183278]:       <nova:flavor name="m1.nano">
Jan 21 18:31:34 compute-0 nova_compute[183278]:         <nova:memory>128</nova:memory>
Jan 21 18:31:34 compute-0 nova_compute[183278]:         <nova:disk>1</nova:disk>
Jan 21 18:31:34 compute-0 nova_compute[183278]:         <nova:swap>0</nova:swap>
Jan 21 18:31:34 compute-0 nova_compute[183278]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 18:31:34 compute-0 nova_compute[183278]:         <nova:vcpus>1</nova:vcpus>
Jan 21 18:31:34 compute-0 nova_compute[183278]:       </nova:flavor>
Jan 21 18:31:34 compute-0 nova_compute[183278]:       <nova:owner>
Jan 21 18:31:34 compute-0 nova_compute[183278]:         <nova:user uuid="41dc6e790bc54fbfaf5c6007d3fa5f63">tempest-TestExecuteStrategies-1753607426-project-member</nova:user>
Jan 21 18:31:34 compute-0 nova_compute[183278]:         <nova:project uuid="fe688847145f4dee992c72dd40bbc1ac">tempest-TestExecuteStrategies-1753607426</nova:project>
Jan 21 18:31:34 compute-0 nova_compute[183278]:       </nova:owner>
Jan 21 18:31:34 compute-0 nova_compute[183278]:       <nova:root type="image" uuid="672306ae-5521-4fc1-a825-a16d6d125c61"/>
Jan 21 18:31:34 compute-0 nova_compute[183278]:       <nova:ports>
Jan 21 18:31:34 compute-0 nova_compute[183278]:         <nova:port uuid="23b1c1c6-59de-47af-98b4-6d28c3f94ff3">
Jan 21 18:31:34 compute-0 nova_compute[183278]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 21 18:31:34 compute-0 nova_compute[183278]:         </nova:port>
Jan 21 18:31:34 compute-0 nova_compute[183278]:       </nova:ports>
Jan 21 18:31:34 compute-0 nova_compute[183278]:     </nova:instance>
Jan 21 18:31:34 compute-0 nova_compute[183278]:   </metadata>
Jan 21 18:31:34 compute-0 nova_compute[183278]:   <sysinfo type="smbios">
Jan 21 18:31:34 compute-0 nova_compute[183278]:     <system>
Jan 21 18:31:34 compute-0 nova_compute[183278]:       <entry name="manufacturer">RDO</entry>
Jan 21 18:31:34 compute-0 nova_compute[183278]:       <entry name="product">OpenStack Compute</entry>
Jan 21 18:31:34 compute-0 nova_compute[183278]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 18:31:34 compute-0 nova_compute[183278]:       <entry name="serial">c0c2fb2d-4e32-4586-8725-c987369255b6</entry>
Jan 21 18:31:34 compute-0 nova_compute[183278]:       <entry name="uuid">c0c2fb2d-4e32-4586-8725-c987369255b6</entry>
Jan 21 18:31:34 compute-0 nova_compute[183278]:       <entry name="family">Virtual Machine</entry>
Jan 21 18:31:34 compute-0 nova_compute[183278]:     </system>
Jan 21 18:31:34 compute-0 nova_compute[183278]:   </sysinfo>
Jan 21 18:31:34 compute-0 nova_compute[183278]:   <os>
Jan 21 18:31:34 compute-0 nova_compute[183278]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 18:31:34 compute-0 nova_compute[183278]:     <boot dev="hd"/>
Jan 21 18:31:34 compute-0 nova_compute[183278]:     <smbios mode="sysinfo"/>
Jan 21 18:31:34 compute-0 nova_compute[183278]:   </os>
Jan 21 18:31:34 compute-0 nova_compute[183278]:   <features>
Jan 21 18:31:34 compute-0 nova_compute[183278]:     <acpi/>
Jan 21 18:31:34 compute-0 nova_compute[183278]:     <apic/>
Jan 21 18:31:34 compute-0 nova_compute[183278]:     <vmcoreinfo/>
Jan 21 18:31:34 compute-0 nova_compute[183278]:   </features>
Jan 21 18:31:34 compute-0 nova_compute[183278]:   <clock offset="utc">
Jan 21 18:31:34 compute-0 nova_compute[183278]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 18:31:34 compute-0 nova_compute[183278]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 18:31:34 compute-0 nova_compute[183278]:     <timer name="hpet" present="no"/>
Jan 21 18:31:34 compute-0 nova_compute[183278]:   </clock>
Jan 21 18:31:34 compute-0 nova_compute[183278]:   <cpu mode="custom" match="exact">
Jan 21 18:31:34 compute-0 nova_compute[183278]:     <model>Nehalem</model>
Jan 21 18:31:34 compute-0 nova_compute[183278]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 18:31:34 compute-0 nova_compute[183278]:   </cpu>
Jan 21 18:31:34 compute-0 nova_compute[183278]:   <devices>
Jan 21 18:31:34 compute-0 nova_compute[183278]:     <disk type="file" device="disk">
Jan 21 18:31:34 compute-0 nova_compute[183278]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 18:31:34 compute-0 nova_compute[183278]:       <source file="/var/lib/nova/instances/c0c2fb2d-4e32-4586-8725-c987369255b6/disk"/>
Jan 21 18:31:34 compute-0 nova_compute[183278]:       <target dev="vda" bus="virtio"/>
Jan 21 18:31:34 compute-0 nova_compute[183278]:     </disk>
Jan 21 18:31:34 compute-0 nova_compute[183278]:     <disk type="file" device="cdrom">
Jan 21 18:31:34 compute-0 nova_compute[183278]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 18:31:34 compute-0 nova_compute[183278]:       <source file="/var/lib/nova/instances/c0c2fb2d-4e32-4586-8725-c987369255b6/disk.config"/>
Jan 21 18:31:34 compute-0 nova_compute[183278]:       <target dev="sda" bus="sata"/>
Jan 21 18:31:34 compute-0 nova_compute[183278]:     </disk>
Jan 21 18:31:34 compute-0 nova_compute[183278]:     <interface type="ethernet">
Jan 21 18:31:34 compute-0 nova_compute[183278]:       <mac address="fa:16:3e:1e:2a:51"/>
Jan 21 18:31:34 compute-0 nova_compute[183278]:       <model type="virtio"/>
Jan 21 18:31:34 compute-0 nova_compute[183278]:       <driver name="vhost" rx_queue_size="512"/>
Jan 21 18:31:34 compute-0 nova_compute[183278]:       <mtu size="1442"/>
Jan 21 18:31:34 compute-0 nova_compute[183278]:       <target dev="tap23b1c1c6-59"/>
Jan 21 18:31:34 compute-0 nova_compute[183278]:     </interface>
Jan 21 18:31:34 compute-0 nova_compute[183278]:     <serial type="pty">
Jan 21 18:31:34 compute-0 nova_compute[183278]:       <log file="/var/lib/nova/instances/c0c2fb2d-4e32-4586-8725-c987369255b6/console.log" append="off"/>
Jan 21 18:31:34 compute-0 nova_compute[183278]:     </serial>
Jan 21 18:31:34 compute-0 nova_compute[183278]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 18:31:34 compute-0 nova_compute[183278]:     <video>
Jan 21 18:31:34 compute-0 nova_compute[183278]:       <model type="virtio"/>
Jan 21 18:31:34 compute-0 nova_compute[183278]:     </video>
Jan 21 18:31:34 compute-0 nova_compute[183278]:     <input type="tablet" bus="usb"/>
Jan 21 18:31:34 compute-0 nova_compute[183278]:     <rng model="virtio">
Jan 21 18:31:34 compute-0 nova_compute[183278]:       <backend model="random">/dev/urandom</backend>
Jan 21 18:31:34 compute-0 nova_compute[183278]:     </rng>
Jan 21 18:31:34 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root"/>
Jan 21 18:31:34 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:31:34 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:31:34 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:31:34 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:31:34 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:31:34 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:31:34 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:31:34 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:31:34 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:31:34 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:31:34 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:31:34 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:31:34 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:31:34 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:31:34 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:31:34 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:31:34 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:31:34 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:31:34 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:31:34 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:31:34 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:31:34 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:31:34 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:31:34 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:31:34 compute-0 nova_compute[183278]:     <controller type="usb" index="0"/>
Jan 21 18:31:34 compute-0 nova_compute[183278]:     <memballoon model="virtio">
Jan 21 18:31:34 compute-0 nova_compute[183278]:       <stats period="10"/>
Jan 21 18:31:34 compute-0 nova_compute[183278]:     </memballoon>
Jan 21 18:31:34 compute-0 nova_compute[183278]:   </devices>
Jan 21 18:31:34 compute-0 nova_compute[183278]: </domain>
Jan 21 18:31:34 compute-0 nova_compute[183278]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 18:31:34 compute-0 nova_compute[183278]: 2026-01-21 18:31:34.265 183284 DEBUG nova.compute.manager [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: c0c2fb2d-4e32-4586-8725-c987369255b6] Preparing to wait for external event network-vif-plugged-23b1c1c6-59de-47af-98b4-6d28c3f94ff3 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 21 18:31:34 compute-0 nova_compute[183278]: 2026-01-21 18:31:34.266 183284 DEBUG oslo_concurrency.lockutils [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Acquiring lock "c0c2fb2d-4e32-4586-8725-c987369255b6-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:31:34 compute-0 nova_compute[183278]: 2026-01-21 18:31:34.266 183284 DEBUG oslo_concurrency.lockutils [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "c0c2fb2d-4e32-4586-8725-c987369255b6-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:31:34 compute-0 nova_compute[183278]: 2026-01-21 18:31:34.266 183284 DEBUG oslo_concurrency.lockutils [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "c0c2fb2d-4e32-4586-8725-c987369255b6-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:31:34 compute-0 nova_compute[183278]: 2026-01-21 18:31:34.267 183284 DEBUG nova.virt.libvirt.vif [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T18:31:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-746758313',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-746758313',id=19,image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fe688847145f4dee992c72dd40bbc1ac',ramdisk_id='',reservation_id='r-9qo34i17',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1753607426',owner_user_name='tempest-TestExecuteStrategies-1753607426-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T18:31:27Z,user_data=None,user_id='41dc6e790bc54fbfaf5c6007d3fa5f63',uuid=c0c2fb2d-4e32-4586-8725-c987369255b6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "23b1c1c6-59de-47af-98b4-6d28c3f94ff3", "address": "fa:16:3e:1e:2a:51", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23b1c1c6-59", "ovs_interfaceid": "23b1c1c6-59de-47af-98b4-6d28c3f94ff3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 21 18:31:34 compute-0 nova_compute[183278]: 2026-01-21 18:31:34.267 183284 DEBUG nova.network.os_vif_util [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Converting VIF {"id": "23b1c1c6-59de-47af-98b4-6d28c3f94ff3", "address": "fa:16:3e:1e:2a:51", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23b1c1c6-59", "ovs_interfaceid": "23b1c1c6-59de-47af-98b4-6d28c3f94ff3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 18:31:34 compute-0 nova_compute[183278]: 2026-01-21 18:31:34.268 183284 DEBUG nova.network.os_vif_util [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1e:2a:51,bridge_name='br-int',has_traffic_filtering=True,id=23b1c1c6-59de-47af-98b4-6d28c3f94ff3,network=Network(405ec01b-76d3-4c3c-a31b-5f16d9641fbf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap23b1c1c6-59') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 18:31:34 compute-0 nova_compute[183278]: 2026-01-21 18:31:34.268 183284 DEBUG os_vif [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1e:2a:51,bridge_name='br-int',has_traffic_filtering=True,id=23b1c1c6-59de-47af-98b4-6d28c3f94ff3,network=Network(405ec01b-76d3-4c3c-a31b-5f16d9641fbf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap23b1c1c6-59') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 21 18:31:34 compute-0 nova_compute[183278]: 2026-01-21 18:31:34.269 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:31:34 compute-0 nova_compute[183278]: 2026-01-21 18:31:34.269 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:31:34 compute-0 nova_compute[183278]: 2026-01-21 18:31:34.270 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 18:31:34 compute-0 nova_compute[183278]: 2026-01-21 18:31:34.272 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:31:34 compute-0 nova_compute[183278]: 2026-01-21 18:31:34.272 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap23b1c1c6-59, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:31:34 compute-0 nova_compute[183278]: 2026-01-21 18:31:34.272 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap23b1c1c6-59, col_values=(('external_ids', {'iface-id': '23b1c1c6-59de-47af-98b4-6d28c3f94ff3', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1e:2a:51', 'vm-uuid': 'c0c2fb2d-4e32-4586-8725-c987369255b6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:31:34 compute-0 NetworkManager[55506]: <info>  [1769020294.2748] manager: (tap23b1c1c6-59): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/62)
Jan 21 18:31:34 compute-0 nova_compute[183278]: 2026-01-21 18:31:34.275 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:31:34 compute-0 nova_compute[183278]: 2026-01-21 18:31:34.278 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 18:31:34 compute-0 nova_compute[183278]: 2026-01-21 18:31:34.280 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:31:34 compute-0 nova_compute[183278]: 2026-01-21 18:31:34.281 183284 INFO os_vif [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1e:2a:51,bridge_name='br-int',has_traffic_filtering=True,id=23b1c1c6-59de-47af-98b4-6d28c3f94ff3,network=Network(405ec01b-76d3-4c3c-a31b-5f16d9641fbf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap23b1c1c6-59')
Jan 21 18:31:34 compute-0 nova_compute[183278]: 2026-01-21 18:31:34.554 183284 DEBUG nova.virt.libvirt.driver [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 18:31:34 compute-0 nova_compute[183278]: 2026-01-21 18:31:34.554 183284 DEBUG nova.virt.libvirt.driver [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 18:31:34 compute-0 nova_compute[183278]: 2026-01-21 18:31:34.554 183284 DEBUG nova.virt.libvirt.driver [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] No VIF found with MAC fa:16:3e:1e:2a:51, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 21 18:31:34 compute-0 nova_compute[183278]: 2026-01-21 18:31:34.555 183284 INFO nova.virt.libvirt.driver [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: c0c2fb2d-4e32-4586-8725-c987369255b6] Using config drive
Jan 21 18:31:35 compute-0 nova_compute[183278]: 2026-01-21 18:31:35.881 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:31:36 compute-0 nova_compute[183278]: 2026-01-21 18:31:36.585 183284 INFO nova.virt.libvirt.driver [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: c0c2fb2d-4e32-4586-8725-c987369255b6] Creating config drive at /var/lib/nova/instances/c0c2fb2d-4e32-4586-8725-c987369255b6/disk.config
Jan 21 18:31:36 compute-0 nova_compute[183278]: 2026-01-21 18:31:36.591 183284 DEBUG oslo_concurrency.processutils [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c0c2fb2d-4e32-4586-8725-c987369255b6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1v86cdk8 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:31:36 compute-0 nova_compute[183278]: 2026-01-21 18:31:36.715 183284 DEBUG oslo_concurrency.processutils [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c0c2fb2d-4e32-4586-8725-c987369255b6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1v86cdk8" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:31:36 compute-0 kernel: tap23b1c1c6-59: entered promiscuous mode
Jan 21 18:31:36 compute-0 NetworkManager[55506]: <info>  [1769020296.7891] manager: (tap23b1c1c6-59): new Tun device (/org/freedesktop/NetworkManager/Devices/63)
Jan 21 18:31:36 compute-0 ovn_controller[95419]: 2026-01-21T18:31:36Z|00137|binding|INFO|Claiming lport 23b1c1c6-59de-47af-98b4-6d28c3f94ff3 for this chassis.
Jan 21 18:31:36 compute-0 nova_compute[183278]: 2026-01-21 18:31:36.788 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:31:36 compute-0 ovn_controller[95419]: 2026-01-21T18:31:36Z|00138|binding|INFO|23b1c1c6-59de-47af-98b4-6d28c3f94ff3: Claiming fa:16:3e:1e:2a:51 10.100.0.8
Jan 21 18:31:36 compute-0 ovn_controller[95419]: 2026-01-21T18:31:36Z|00139|binding|INFO|Setting lport 23b1c1c6-59de-47af-98b4-6d28c3f94ff3 ovn-installed in OVS
Jan 21 18:31:36 compute-0 ovn_controller[95419]: 2026-01-21T18:31:36Z|00140|binding|INFO|Setting lport 23b1c1c6-59de-47af-98b4-6d28c3f94ff3 up in Southbound
Jan 21 18:31:36 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:31:36.804 104698 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1e:2a:51 10.100.0.8'], port_security=['fa:16:3e:1e:2a:51 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'c0c2fb2d-4e32-4586-8725-c987369255b6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-405ec01b-76d3-4c3c-a31b-5f16d9641fbf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe688847145f4dee992c72dd40bbc1ac', 'neutron:revision_number': '2', 'neutron:security_group_ids': '772bc98e-4f63-476c-ac15-1e185ee339f7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=64b67886-c0d9-40d2-a2d0-cf96a9cd3c14, chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>], logical_port=23b1c1c6-59de-47af-98b4-6d28c3f94ff3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 18:31:36 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:31:36.805 104698 INFO neutron.agent.ovn.metadata.agent [-] Port 23b1c1c6-59de-47af-98b4-6d28c3f94ff3 in datapath 405ec01b-76d3-4c3c-a31b-5f16d9641fbf bound to our chassis
Jan 21 18:31:36 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:31:36.806 104698 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 405ec01b-76d3-4c3c-a31b-5f16d9641fbf
Jan 21 18:31:36 compute-0 nova_compute[183278]: 2026-01-21 18:31:36.807 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:31:36 compute-0 nova_compute[183278]: 2026-01-21 18:31:36.812 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:31:36 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:31:36.818 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[5537beed-1d86-4833-8977-335a472996f9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:31:36 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:31:36.819 104698 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap405ec01b-71 in ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 21 18:31:36 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:31:36.821 203892 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap405ec01b-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 21 18:31:36 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:31:36.822 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[7d1ee3e9-89c7-4bc4-a151-6f213f094bca]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:31:36 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:31:36.823 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[9151e8eb-57bd-4e8f-a608-4f081caac2b1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:31:36 compute-0 systemd-udevd[209947]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 18:31:36 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:31:36.832 105036 DEBUG oslo.privsep.daemon [-] privsep: reply[66d3de67-a05e-4071-869f-1766a18867e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:31:36 compute-0 systemd-machined[154592]: New machine qemu-13-instance-00000013.
Jan 21 18:31:36 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:31:36.845 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[5c3de09b-6b8b-461d-9bd3-f7b0a06f54b7]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:31:36 compute-0 NetworkManager[55506]: <info>  [1769020296.8490] device (tap23b1c1c6-59): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 18:31:36 compute-0 NetworkManager[55506]: <info>  [1769020296.8498] device (tap23b1c1c6-59): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 18:31:36 compute-0 systemd[1]: Started Virtual Machine qemu-13-instance-00000013.
Jan 21 18:31:36 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:31:36.875 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[65a48a1c-7bd0-47ce-99cc-6a6ce1057016]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:31:36 compute-0 systemd-udevd[209960]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 18:31:36 compute-0 NetworkManager[55506]: <info>  [1769020296.8823] manager: (tap405ec01b-70): new Veth device (/org/freedesktop/NetworkManager/Devices/64)
Jan 21 18:31:36 compute-0 podman[209927]: 2026-01-21 18:31:36.88259342 +0000 UTC m=+0.096295979 container health_status 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, version=9.6, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vendor=Red Hat, Inc., name=ubi9-minimal, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter)
Jan 21 18:31:36 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:31:36.881 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[f429e6da-6714-413b-96d1-0192da769a20]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:31:36 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:31:36.910 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[422e8c0d-f3ad-4f27-8112-e5fb99eb4eed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:31:36 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:31:36.914 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[d7cc876b-130a-4565-9187-e21bd464529d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:31:36 compute-0 NetworkManager[55506]: <info>  [1769020296.9397] device (tap405ec01b-70): carrier: link connected
Jan 21 18:31:36 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:31:36.944 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[42ef8e38-a79a-4538-9284-82c19c580752]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:31:36 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:31:36.962 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[4140b92e-ed3f-48f1-97a5-3a17a7a9f40d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap405ec01b-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6c:95:02'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 486054, 'reachable_time': 26720, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 209988, 'error': None, 'target': 'ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:31:36 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:31:36.976 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[e4cb36ad-3c0e-4a2a-a606-475b8a902734]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe6c:9502'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 486054, 'tstamp': 486054}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209989, 'error': None, 'target': 'ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:31:36 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:31:36.990 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[e8e4e2c1-72c6-454a-b1a1-2d72c183a303]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap405ec01b-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6c:95:02'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 486054, 'reachable_time': 26720, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 209990, 'error': None, 'target': 'ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:31:37 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:31:37.016 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[aa336dca-5ac7-48bb-ba08-7406669577a6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:31:37 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:31:37.078 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[1cb34e7e-a451-4573-bb54-454413fb2f9d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:31:37 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:31:37.080 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap405ec01b-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:31:37 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:31:37.080 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 18:31:37 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:31:37.080 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap405ec01b-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:31:37 compute-0 nova_compute[183278]: 2026-01-21 18:31:37.082 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:31:37 compute-0 kernel: tap405ec01b-70: entered promiscuous mode
Jan 21 18:31:37 compute-0 NetworkManager[55506]: <info>  [1769020297.0829] manager: (tap405ec01b-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/65)
Jan 21 18:31:37 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:31:37.086 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap405ec01b-70, col_values=(('external_ids', {'iface-id': '9c897ad2-8ce5-4903-8c83-1ed8f117dcdd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:31:37 compute-0 nova_compute[183278]: 2026-01-21 18:31:37.087 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:31:37 compute-0 ovn_controller[95419]: 2026-01-21T18:31:37Z|00141|binding|INFO|Releasing lport 9c897ad2-8ce5-4903-8c83-1ed8f117dcdd from this chassis (sb_readonly=0)
Jan 21 18:31:37 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:31:37.090 104698 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/405ec01b-76d3-4c3c-a31b-5f16d9641fbf.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/405ec01b-76d3-4c3c-a31b-5f16d9641fbf.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 21 18:31:37 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:31:37.098 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[44451b11-3281-4482-adcf-a4a7197ae67f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:31:37 compute-0 nova_compute[183278]: 2026-01-21 18:31:37.099 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:31:37 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:31:37.099 104698 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 18:31:37 compute-0 ovn_metadata_agent[104693]: global
Jan 21 18:31:37 compute-0 ovn_metadata_agent[104693]:     log         /dev/log local0 debug
Jan 21 18:31:37 compute-0 ovn_metadata_agent[104693]:     log-tag     haproxy-metadata-proxy-405ec01b-76d3-4c3c-a31b-5f16d9641fbf
Jan 21 18:31:37 compute-0 ovn_metadata_agent[104693]:     user        root
Jan 21 18:31:37 compute-0 ovn_metadata_agent[104693]:     group       root
Jan 21 18:31:37 compute-0 ovn_metadata_agent[104693]:     maxconn     1024
Jan 21 18:31:37 compute-0 ovn_metadata_agent[104693]:     pidfile     /var/lib/neutron/external/pids/405ec01b-76d3-4c3c-a31b-5f16d9641fbf.pid.haproxy
Jan 21 18:31:37 compute-0 ovn_metadata_agent[104693]:     daemon
Jan 21 18:31:37 compute-0 ovn_metadata_agent[104693]: 
Jan 21 18:31:37 compute-0 ovn_metadata_agent[104693]: defaults
Jan 21 18:31:37 compute-0 ovn_metadata_agent[104693]:     log global
Jan 21 18:31:37 compute-0 ovn_metadata_agent[104693]:     mode http
Jan 21 18:31:37 compute-0 ovn_metadata_agent[104693]:     option httplog
Jan 21 18:31:37 compute-0 ovn_metadata_agent[104693]:     option dontlognull
Jan 21 18:31:37 compute-0 ovn_metadata_agent[104693]:     option http-server-close
Jan 21 18:31:37 compute-0 ovn_metadata_agent[104693]:     option forwardfor
Jan 21 18:31:37 compute-0 ovn_metadata_agent[104693]:     retries                 3
Jan 21 18:31:37 compute-0 ovn_metadata_agent[104693]:     timeout http-request    30s
Jan 21 18:31:37 compute-0 ovn_metadata_agent[104693]:     timeout connect         30s
Jan 21 18:31:37 compute-0 ovn_metadata_agent[104693]:     timeout client          32s
Jan 21 18:31:37 compute-0 ovn_metadata_agent[104693]:     timeout server          32s
Jan 21 18:31:37 compute-0 ovn_metadata_agent[104693]:     timeout http-keep-alive 30s
Jan 21 18:31:37 compute-0 ovn_metadata_agent[104693]: 
Jan 21 18:31:37 compute-0 ovn_metadata_agent[104693]: 
Jan 21 18:31:37 compute-0 ovn_metadata_agent[104693]: listen listener
Jan 21 18:31:37 compute-0 ovn_metadata_agent[104693]:     bind 169.254.169.254:80
Jan 21 18:31:37 compute-0 ovn_metadata_agent[104693]:     server metadata /var/lib/neutron/metadata_proxy
Jan 21 18:31:37 compute-0 ovn_metadata_agent[104693]:     http-request add-header X-OVN-Network-ID 405ec01b-76d3-4c3c-a31b-5f16d9641fbf
Jan 21 18:31:37 compute-0 ovn_metadata_agent[104693]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 21 18:31:37 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:31:37.100 104698 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf', 'env', 'PROCESS_TAG=haproxy-405ec01b-76d3-4c3c-a31b-5f16d9641fbf', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/405ec01b-76d3-4c3c-a31b-5f16d9641fbf.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 21 18:31:37 compute-0 nova_compute[183278]: 2026-01-21 18:31:37.493 183284 DEBUG nova.virt.driver [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Emitting event <LifecycleEvent: 1769020297.493114, c0c2fb2d-4e32-4586-8725-c987369255b6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:31:37 compute-0 nova_compute[183278]: 2026-01-21 18:31:37.494 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: c0c2fb2d-4e32-4586-8725-c987369255b6] VM Started (Lifecycle Event)
Jan 21 18:31:37 compute-0 nova_compute[183278]: 2026-01-21 18:31:37.516 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: c0c2fb2d-4e32-4586-8725-c987369255b6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:31:37 compute-0 nova_compute[183278]: 2026-01-21 18:31:37.521 183284 DEBUG nova.virt.driver [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Emitting event <LifecycleEvent: 1769020297.4940848, c0c2fb2d-4e32-4586-8725-c987369255b6 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:31:37 compute-0 nova_compute[183278]: 2026-01-21 18:31:37.521 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: c0c2fb2d-4e32-4586-8725-c987369255b6] VM Paused (Lifecycle Event)
Jan 21 18:31:37 compute-0 podman[210023]: 2026-01-21 18:31:37.434348359 +0000 UTC m=+0.025400596 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 18:31:37 compute-0 nova_compute[183278]: 2026-01-21 18:31:37.550 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: c0c2fb2d-4e32-4586-8725-c987369255b6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:31:37 compute-0 nova_compute[183278]: 2026-01-21 18:31:37.554 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: c0c2fb2d-4e32-4586-8725-c987369255b6] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 18:31:37 compute-0 nova_compute[183278]: 2026-01-21 18:31:37.575 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: c0c2fb2d-4e32-4586-8725-c987369255b6] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 18:31:37 compute-0 nova_compute[183278]: 2026-01-21 18:31:37.633 183284 DEBUG nova.compute.manager [req-b1373dcc-16ca-4627-afaa-8303226dedbb req-0192d021-c759-461c-a491-499c84fa4586 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c0c2fb2d-4e32-4586-8725-c987369255b6] Received event network-vif-plugged-23b1c1c6-59de-47af-98b4-6d28c3f94ff3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:31:37 compute-0 nova_compute[183278]: 2026-01-21 18:31:37.634 183284 DEBUG oslo_concurrency.lockutils [req-b1373dcc-16ca-4627-afaa-8303226dedbb req-0192d021-c759-461c-a491-499c84fa4586 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "c0c2fb2d-4e32-4586-8725-c987369255b6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:31:37 compute-0 nova_compute[183278]: 2026-01-21 18:31:37.635 183284 DEBUG oslo_concurrency.lockutils [req-b1373dcc-16ca-4627-afaa-8303226dedbb req-0192d021-c759-461c-a491-499c84fa4586 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "c0c2fb2d-4e32-4586-8725-c987369255b6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:31:37 compute-0 nova_compute[183278]: 2026-01-21 18:31:37.635 183284 DEBUG oslo_concurrency.lockutils [req-b1373dcc-16ca-4627-afaa-8303226dedbb req-0192d021-c759-461c-a491-499c84fa4586 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "c0c2fb2d-4e32-4586-8725-c987369255b6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:31:37 compute-0 nova_compute[183278]: 2026-01-21 18:31:37.635 183284 DEBUG nova.compute.manager [req-b1373dcc-16ca-4627-afaa-8303226dedbb req-0192d021-c759-461c-a491-499c84fa4586 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c0c2fb2d-4e32-4586-8725-c987369255b6] Processing event network-vif-plugged-23b1c1c6-59de-47af-98b4-6d28c3f94ff3 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 21 18:31:37 compute-0 nova_compute[183278]: 2026-01-21 18:31:37.636 183284 DEBUG nova.compute.manager [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: c0c2fb2d-4e32-4586-8725-c987369255b6] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 18:31:37 compute-0 nova_compute[183278]: 2026-01-21 18:31:37.647 183284 DEBUG nova.virt.driver [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Emitting event <LifecycleEvent: 1769020297.6469967, c0c2fb2d-4e32-4586-8725-c987369255b6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:31:37 compute-0 nova_compute[183278]: 2026-01-21 18:31:37.647 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: c0c2fb2d-4e32-4586-8725-c987369255b6] VM Resumed (Lifecycle Event)
Jan 21 18:31:37 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:31:37.648 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a4db021d-a451-4e5f-8011-49af760bda68, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:31:37 compute-0 nova_compute[183278]: 2026-01-21 18:31:37.649 183284 DEBUG nova.virt.libvirt.driver [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: c0c2fb2d-4e32-4586-8725-c987369255b6] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 18:31:37 compute-0 nova_compute[183278]: 2026-01-21 18:31:37.651 183284 INFO nova.virt.libvirt.driver [-] [instance: c0c2fb2d-4e32-4586-8725-c987369255b6] Instance spawned successfully.
Jan 21 18:31:37 compute-0 nova_compute[183278]: 2026-01-21 18:31:37.652 183284 DEBUG nova.virt.libvirt.driver [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: c0c2fb2d-4e32-4586-8725-c987369255b6] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 21 18:31:37 compute-0 nova_compute[183278]: 2026-01-21 18:31:37.675 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: c0c2fb2d-4e32-4586-8725-c987369255b6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:31:37 compute-0 nova_compute[183278]: 2026-01-21 18:31:37.680 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: c0c2fb2d-4e32-4586-8725-c987369255b6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 18:31:37 compute-0 nova_compute[183278]: 2026-01-21 18:31:37.683 183284 DEBUG nova.virt.libvirt.driver [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: c0c2fb2d-4e32-4586-8725-c987369255b6] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:31:37 compute-0 nova_compute[183278]: 2026-01-21 18:31:37.683 183284 DEBUG nova.virt.libvirt.driver [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: c0c2fb2d-4e32-4586-8725-c987369255b6] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:31:37 compute-0 nova_compute[183278]: 2026-01-21 18:31:37.684 183284 DEBUG nova.virt.libvirt.driver [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: c0c2fb2d-4e32-4586-8725-c987369255b6] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:31:37 compute-0 nova_compute[183278]: 2026-01-21 18:31:37.684 183284 DEBUG nova.virt.libvirt.driver [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: c0c2fb2d-4e32-4586-8725-c987369255b6] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:31:37 compute-0 nova_compute[183278]: 2026-01-21 18:31:37.685 183284 DEBUG nova.virt.libvirt.driver [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: c0c2fb2d-4e32-4586-8725-c987369255b6] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:31:37 compute-0 nova_compute[183278]: 2026-01-21 18:31:37.685 183284 DEBUG nova.virt.libvirt.driver [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: c0c2fb2d-4e32-4586-8725-c987369255b6] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:31:37 compute-0 nova_compute[183278]: 2026-01-21 18:31:37.714 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: c0c2fb2d-4e32-4586-8725-c987369255b6] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 18:31:37 compute-0 nova_compute[183278]: 2026-01-21 18:31:37.747 183284 INFO nova.compute.manager [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: c0c2fb2d-4e32-4586-8725-c987369255b6] Took 9.73 seconds to spawn the instance on the hypervisor.
Jan 21 18:31:37 compute-0 nova_compute[183278]: 2026-01-21 18:31:37.747 183284 DEBUG nova.compute.manager [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: c0c2fb2d-4e32-4586-8725-c987369255b6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:31:37 compute-0 nova_compute[183278]: 2026-01-21 18:31:37.816 183284 INFO nova.compute.manager [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: c0c2fb2d-4e32-4586-8725-c987369255b6] Took 10.90 seconds to build instance.
Jan 21 18:31:37 compute-0 nova_compute[183278]: 2026-01-21 18:31:37.833 183284 DEBUG oslo_concurrency.lockutils [None req-5bcf1263-a84e-45c6-85a0-355e6bb4d169 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "c0c2fb2d-4e32-4586-8725-c987369255b6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.992s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:31:37 compute-0 nova_compute[183278]: 2026-01-21 18:31:37.874 183284 DEBUG nova.network.neutron [req-2662d0a1-ba2e-44a7-a07a-e0dd74de47d8 req-d37733f8-bb8c-4dc2-914b-76fe102ad477 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c0c2fb2d-4e32-4586-8725-c987369255b6] Updated VIF entry in instance network info cache for port 23b1c1c6-59de-47af-98b4-6d28c3f94ff3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 18:31:37 compute-0 nova_compute[183278]: 2026-01-21 18:31:37.874 183284 DEBUG nova.network.neutron [req-2662d0a1-ba2e-44a7-a07a-e0dd74de47d8 req-d37733f8-bb8c-4dc2-914b-76fe102ad477 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c0c2fb2d-4e32-4586-8725-c987369255b6] Updating instance_info_cache with network_info: [{"id": "23b1c1c6-59de-47af-98b4-6d28c3f94ff3", "address": "fa:16:3e:1e:2a:51", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23b1c1c6-59", "ovs_interfaceid": "23b1c1c6-59de-47af-98b4-6d28c3f94ff3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:31:37 compute-0 nova_compute[183278]: 2026-01-21 18:31:37.887 183284 DEBUG oslo_concurrency.lockutils [req-2662d0a1-ba2e-44a7-a07a-e0dd74de47d8 req-d37733f8-bb8c-4dc2-914b-76fe102ad477 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Releasing lock "refresh_cache-c0c2fb2d-4e32-4586-8725-c987369255b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 18:31:38 compute-0 podman[210023]: 2026-01-21 18:31:38.941121176 +0000 UTC m=+1.532173403 container create ec7b359feada0b8a30bfecb70489fa2e682125fd69f4fe379717f9c3b8aa4829 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Jan 21 18:31:39 compute-0 nova_compute[183278]: 2026-01-21 18:31:39.275 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:31:39 compute-0 systemd[1]: Started libpod-conmon-ec7b359feada0b8a30bfecb70489fa2e682125fd69f4fe379717f9c3b8aa4829.scope.
Jan 21 18:31:39 compute-0 systemd[1]: Started libcrun container.
Jan 21 18:31:39 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2f54d3ca2ab1a4670fd6c48c83e2af08ea3bdb66afec45da15317ad843a39e48/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 18:31:39 compute-0 podman[210023]: 2026-01-21 18:31:39.444938677 +0000 UTC m=+2.035990924 container init ec7b359feada0b8a30bfecb70489fa2e682125fd69f4fe379717f9c3b8aa4829 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 21 18:31:39 compute-0 podman[210023]: 2026-01-21 18:31:39.454002556 +0000 UTC m=+2.045054773 container start ec7b359feada0b8a30bfecb70489fa2e682125fd69f4fe379717f9c3b8aa4829 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 21 18:31:39 compute-0 neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf[210043]: [NOTICE]   (210047) : New worker (210049) forked
Jan 21 18:31:39 compute-0 neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf[210043]: [NOTICE]   (210047) : Loading success.
Jan 21 18:31:39 compute-0 nova_compute[183278]: 2026-01-21 18:31:39.783 183284 DEBUG nova.compute.manager [req-4555a8ea-6fc0-44b1-8b16-01dbb9552dcf req-ee47f480-8e4b-49f6-a262-d0a45f7d45e6 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c0c2fb2d-4e32-4586-8725-c987369255b6] Received event network-vif-plugged-23b1c1c6-59de-47af-98b4-6d28c3f94ff3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:31:39 compute-0 nova_compute[183278]: 2026-01-21 18:31:39.784 183284 DEBUG oslo_concurrency.lockutils [req-4555a8ea-6fc0-44b1-8b16-01dbb9552dcf req-ee47f480-8e4b-49f6-a262-d0a45f7d45e6 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "c0c2fb2d-4e32-4586-8725-c987369255b6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:31:39 compute-0 nova_compute[183278]: 2026-01-21 18:31:39.785 183284 DEBUG oslo_concurrency.lockutils [req-4555a8ea-6fc0-44b1-8b16-01dbb9552dcf req-ee47f480-8e4b-49f6-a262-d0a45f7d45e6 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "c0c2fb2d-4e32-4586-8725-c987369255b6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:31:39 compute-0 nova_compute[183278]: 2026-01-21 18:31:39.785 183284 DEBUG oslo_concurrency.lockutils [req-4555a8ea-6fc0-44b1-8b16-01dbb9552dcf req-ee47f480-8e4b-49f6-a262-d0a45f7d45e6 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "c0c2fb2d-4e32-4586-8725-c987369255b6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:31:39 compute-0 nova_compute[183278]: 2026-01-21 18:31:39.785 183284 DEBUG nova.compute.manager [req-4555a8ea-6fc0-44b1-8b16-01dbb9552dcf req-ee47f480-8e4b-49f6-a262-d0a45f7d45e6 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c0c2fb2d-4e32-4586-8725-c987369255b6] No waiting events found dispatching network-vif-plugged-23b1c1c6-59de-47af-98b4-6d28c3f94ff3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:31:39 compute-0 nova_compute[183278]: 2026-01-21 18:31:39.786 183284 WARNING nova.compute.manager [req-4555a8ea-6fc0-44b1-8b16-01dbb9552dcf req-ee47f480-8e4b-49f6-a262-d0a45f7d45e6 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c0c2fb2d-4e32-4586-8725-c987369255b6] Received unexpected event network-vif-plugged-23b1c1c6-59de-47af-98b4-6d28c3f94ff3 for instance with vm_state active and task_state None.
Jan 21 18:31:40 compute-0 nova_compute[183278]: 2026-01-21 18:31:40.882 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:31:42 compute-0 podman[210059]: 2026-01-21 18:31:42.004420554 +0000 UTC m=+0.053078784 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Jan 21 18:31:42 compute-0 podman[210058]: 2026-01-21 18:31:42.034945571 +0000 UTC m=+0.086685206 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, config_id=ovn_controller, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 21 18:31:44 compute-0 nova_compute[183278]: 2026-01-21 18:31:44.278 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:31:45 compute-0 nova_compute[183278]: 2026-01-21 18:31:45.885 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:31:47 compute-0 podman[210100]: 2026-01-21 18:31:47.021489076 +0000 UTC m=+0.076651775 container health_status 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 21 18:31:49 compute-0 nova_compute[183278]: 2026-01-21 18:31:49.281 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:31:50 compute-0 nova_compute[183278]: 2026-01-21 18:31:50.885 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:31:51 compute-0 ovn_controller[95419]: 2026-01-21T18:31:51Z|00024|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:1e:2a:51 10.100.0.8
Jan 21 18:31:51 compute-0 ovn_controller[95419]: 2026-01-21T18:31:51Z|00025|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:1e:2a:51 10.100.0.8
Jan 21 18:31:54 compute-0 nova_compute[183278]: 2026-01-21 18:31:54.284 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:31:55 compute-0 nova_compute[183278]: 2026-01-21 18:31:55.888 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:31:59 compute-0 nova_compute[183278]: 2026-01-21 18:31:59.286 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:31:59 compute-0 podman[192560]: time="2026-01-21T18:31:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 18:31:59 compute-0 podman[192560]: @ - - [21/Jan/2026:18:31:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16586 "" "Go-http-client/1.1"
Jan 21 18:31:59 compute-0 podman[192560]: @ - - [21/Jan/2026:18:31:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2640 "" "Go-http-client/1.1"
Jan 21 18:32:00 compute-0 nova_compute[183278]: 2026-01-21 18:32:00.890 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:32:01 compute-0 openstack_network_exporter[195402]: ERROR   18:32:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 21 18:32:01 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:32:01 compute-0 openstack_network_exporter[195402]: ERROR   18:32:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 21 18:32:01 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:32:04 compute-0 nova_compute[183278]: 2026-01-21 18:32:04.289 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:32:05 compute-0 nova_compute[183278]: 2026-01-21 18:32:05.893 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:32:07 compute-0 podman[210144]: 2026-01-21 18:32:07.01481432 +0000 UTC m=+0.057316636 container health_status 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.buildah.version=1.33.7, architecture=x86_64, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., release=1755695350, version=9.6)
Jan 21 18:32:09 compute-0 nova_compute[183278]: 2026-01-21 18:32:09.292 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:32:10 compute-0 nova_compute[183278]: 2026-01-21 18:32:10.895 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:32:12 compute-0 podman[210169]: 2026-01-21 18:32:12.988262184 +0000 UTC m=+0.045815869 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 21 18:32:13 compute-0 podman[210168]: 2026-01-21 18:32:13.043346026 +0000 UTC m=+0.103429462 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Jan 21 18:32:14 compute-0 nova_compute[183278]: 2026-01-21 18:32:14.295 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:32:15 compute-0 nova_compute[183278]: 2026-01-21 18:32:15.897 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:32:18 compute-0 podman[210215]: 2026-01-21 18:32:18.028636251 +0000 UTC m=+0.075619779 container health_status 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 21 18:32:19 compute-0 nova_compute[183278]: 2026-01-21 18:32:19.297 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:32:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:32:20.091 104698 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:32:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:32:20.092 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:32:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:32:20.092 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:32:20 compute-0 nova_compute[183278]: 2026-01-21 18:32:20.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:32:20 compute-0 nova_compute[183278]: 2026-01-21 18:32:20.818 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 18:32:20 compute-0 nova_compute[183278]: 2026-01-21 18:32:20.818 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 18:32:20 compute-0 nova_compute[183278]: 2026-01-21 18:32:20.898 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:32:20 compute-0 nova_compute[183278]: 2026-01-21 18:32:20.985 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "refresh_cache-c0c2fb2d-4e32-4586-8725-c987369255b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:32:20 compute-0 nova_compute[183278]: 2026-01-21 18:32:20.986 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquired lock "refresh_cache-c0c2fb2d-4e32-4586-8725-c987369255b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 18:32:20 compute-0 nova_compute[183278]: 2026-01-21 18:32:20.986 183284 DEBUG nova.network.neutron [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] [instance: c0c2fb2d-4e32-4586-8725-c987369255b6] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 21 18:32:20 compute-0 nova_compute[183278]: 2026-01-21 18:32:20.986 183284 DEBUG nova.objects.instance [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lazy-loading 'info_cache' on Instance uuid c0c2fb2d-4e32-4586-8725-c987369255b6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:32:22 compute-0 nova_compute[183278]: 2026-01-21 18:32:22.910 183284 DEBUG nova.network.neutron [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] [instance: c0c2fb2d-4e32-4586-8725-c987369255b6] Updating instance_info_cache with network_info: [{"id": "23b1c1c6-59de-47af-98b4-6d28c3f94ff3", "address": "fa:16:3e:1e:2a:51", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23b1c1c6-59", "ovs_interfaceid": "23b1c1c6-59de-47af-98b4-6d28c3f94ff3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:32:22 compute-0 nova_compute[183278]: 2026-01-21 18:32:22.935 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Releasing lock "refresh_cache-c0c2fb2d-4e32-4586-8725-c987369255b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 18:32:22 compute-0 nova_compute[183278]: 2026-01-21 18:32:22.935 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] [instance: c0c2fb2d-4e32-4586-8725-c987369255b6] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 21 18:32:22 compute-0 nova_compute[183278]: 2026-01-21 18:32:22.935 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:32:23 compute-0 ovn_controller[95419]: 2026-01-21T18:32:23Z|00142|memory_trim|INFO|Detected inactivity (last active 30007 ms ago): trimming memory
Jan 21 18:32:23 compute-0 nova_compute[183278]: 2026-01-21 18:32:23.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:32:23 compute-0 nova_compute[183278]: 2026-01-21 18:32:23.847 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:32:23 compute-0 nova_compute[183278]: 2026-01-21 18:32:23.847 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:32:23 compute-0 nova_compute[183278]: 2026-01-21 18:32:23.847 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:32:23 compute-0 nova_compute[183278]: 2026-01-21 18:32:23.847 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 18:32:23 compute-0 nova_compute[183278]: 2026-01-21 18:32:23.911 183284 DEBUG oslo_concurrency.processutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c0c2fb2d-4e32-4586-8725-c987369255b6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:32:23 compute-0 nova_compute[183278]: 2026-01-21 18:32:23.982 183284 DEBUG oslo_concurrency.processutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c0c2fb2d-4e32-4586-8725-c987369255b6/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:32:23 compute-0 nova_compute[183278]: 2026-01-21 18:32:23.982 183284 DEBUG oslo_concurrency.processutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c0c2fb2d-4e32-4586-8725-c987369255b6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:32:24 compute-0 nova_compute[183278]: 2026-01-21 18:32:24.035 183284 DEBUG oslo_concurrency.processutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c0c2fb2d-4e32-4586-8725-c987369255b6/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:32:24 compute-0 nova_compute[183278]: 2026-01-21 18:32:24.195 183284 WARNING nova.virt.libvirt.driver [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 18:32:24 compute-0 nova_compute[183278]: 2026-01-21 18:32:24.196 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5679MB free_disk=73.35221099853516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 18:32:24 compute-0 nova_compute[183278]: 2026-01-21 18:32:24.196 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:32:24 compute-0 nova_compute[183278]: 2026-01-21 18:32:24.197 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:32:24 compute-0 nova_compute[183278]: 2026-01-21 18:32:24.299 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:32:24 compute-0 nova_compute[183278]: 2026-01-21 18:32:24.590 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Instance c0c2fb2d-4e32-4586-8725-c987369255b6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 21 18:32:24 compute-0 nova_compute[183278]: 2026-01-21 18:32:24.591 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 18:32:24 compute-0 nova_compute[183278]: 2026-01-21 18:32:24.591 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 18:32:24 compute-0 nova_compute[183278]: 2026-01-21 18:32:24.635 183284 DEBUG nova.compute.provider_tree [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Inventory has not changed in ProviderTree for provider: 502e4243-611b-433d-a766-9b485d51652d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 18:32:24 compute-0 nova_compute[183278]: 2026-01-21 18:32:24.673 183284 DEBUG nova.scheduler.client.report [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Inventory has not changed for provider 502e4243-611b-433d-a766-9b485d51652d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 18:32:24 compute-0 nova_compute[183278]: 2026-01-21 18:32:24.856 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 18:32:24 compute-0 nova_compute[183278]: 2026-01-21 18:32:24.856 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.660s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:32:25 compute-0 nova_compute[183278]: 2026-01-21 18:32:25.851 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:32:25 compute-0 nova_compute[183278]: 2026-01-21 18:32:25.851 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:32:25 compute-0 nova_compute[183278]: 2026-01-21 18:32:25.900 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:32:27 compute-0 nova_compute[183278]: 2026-01-21 18:32:27.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:32:28 compute-0 nova_compute[183278]: 2026-01-21 18:32:28.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:32:29 compute-0 nova_compute[183278]: 2026-01-21 18:32:29.209 183284 DEBUG nova.virt.libvirt.driver [None req-c3dc7a2a-afa2-47d1-8bc8-c71ef9a28fc1 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 7bee7c13-0682-47c7-9cc6-8206b550bfcc] Creating tmpfile /var/lib/nova/instances/tmp_d3eskxu to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Jan 21 18:32:29 compute-0 nova_compute[183278]: 2026-01-21 18:32:29.300 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:32:29 compute-0 nova_compute[183278]: 2026-01-21 18:32:29.308 183284 DEBUG nova.compute.manager [None req-c3dc7a2a-afa2-47d1-8bc8-c71ef9a28fc1 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp_d3eskxu',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Jan 21 18:32:29 compute-0 podman[192560]: time="2026-01-21T18:32:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 18:32:29 compute-0 podman[192560]: @ - - [21/Jan/2026:18:32:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16586 "" "Go-http-client/1.1"
Jan 21 18:32:29 compute-0 podman[192560]: @ - - [21/Jan/2026:18:32:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2636 "" "Go-http-client/1.1"
Jan 21 18:32:30 compute-0 nova_compute[183278]: 2026-01-21 18:32:30.051 183284 DEBUG nova.compute.manager [None req-c3dc7a2a-afa2-47d1-8bc8-c71ef9a28fc1 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp_d3eskxu',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='7bee7c13-0682-47c7-9cc6-8206b550bfcc',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Jan 21 18:32:30 compute-0 nova_compute[183278]: 2026-01-21 18:32:30.078 183284 DEBUG oslo_concurrency.lockutils [None req-c3dc7a2a-afa2-47d1-8bc8-c71ef9a28fc1 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "refresh_cache-7bee7c13-0682-47c7-9cc6-8206b550bfcc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:32:30 compute-0 nova_compute[183278]: 2026-01-21 18:32:30.078 183284 DEBUG oslo_concurrency.lockutils [None req-c3dc7a2a-afa2-47d1-8bc8-c71ef9a28fc1 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Acquired lock "refresh_cache-7bee7c13-0682-47c7-9cc6-8206b550bfcc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 18:32:30 compute-0 nova_compute[183278]: 2026-01-21 18:32:30.079 183284 DEBUG nova.network.neutron [None req-c3dc7a2a-afa2-47d1-8bc8-c71ef9a28fc1 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 7bee7c13-0682-47c7-9cc6-8206b550bfcc] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 18:32:30 compute-0 nova_compute[183278]: 2026-01-21 18:32:30.902 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:32:31 compute-0 openstack_network_exporter[195402]: ERROR   18:32:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 21 18:32:31 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:32:31 compute-0 openstack_network_exporter[195402]: ERROR   18:32:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 21 18:32:31 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:32:31 compute-0 nova_compute[183278]: 2026-01-21 18:32:31.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:32:32 compute-0 nova_compute[183278]: 2026-01-21 18:32:32.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:32:32 compute-0 nova_compute[183278]: 2026-01-21 18:32:32.817 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 18:32:33 compute-0 nova_compute[183278]: 2026-01-21 18:32:33.770 183284 DEBUG nova.network.neutron [None req-c3dc7a2a-afa2-47d1-8bc8-c71ef9a28fc1 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 7bee7c13-0682-47c7-9cc6-8206b550bfcc] Updating instance_info_cache with network_info: [{"id": "c70fe92c-1439-4546-9f8b-4471945eef8f", "address": "fa:16:3e:26:16:76", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc70fe92c-14", "ovs_interfaceid": "c70fe92c-1439-4546-9f8b-4471945eef8f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:32:33 compute-0 nova_compute[183278]: 2026-01-21 18:32:33.786 183284 DEBUG oslo_concurrency.lockutils [None req-c3dc7a2a-afa2-47d1-8bc8-c71ef9a28fc1 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Releasing lock "refresh_cache-7bee7c13-0682-47c7-9cc6-8206b550bfcc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 18:32:33 compute-0 nova_compute[183278]: 2026-01-21 18:32:33.787 183284 DEBUG nova.virt.libvirt.driver [None req-c3dc7a2a-afa2-47d1-8bc8-c71ef9a28fc1 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 7bee7c13-0682-47c7-9cc6-8206b550bfcc] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp_d3eskxu',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='7bee7c13-0682-47c7-9cc6-8206b550bfcc',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Jan 21 18:32:33 compute-0 nova_compute[183278]: 2026-01-21 18:32:33.788 183284 DEBUG nova.virt.libvirt.driver [None req-c3dc7a2a-afa2-47d1-8bc8-c71ef9a28fc1 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 7bee7c13-0682-47c7-9cc6-8206b550bfcc] Creating instance directory: /var/lib/nova/instances/7bee7c13-0682-47c7-9cc6-8206b550bfcc pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Jan 21 18:32:33 compute-0 nova_compute[183278]: 2026-01-21 18:32:33.788 183284 DEBUG nova.virt.libvirt.driver [None req-c3dc7a2a-afa2-47d1-8bc8-c71ef9a28fc1 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 7bee7c13-0682-47c7-9cc6-8206b550bfcc] Creating disk.info with the contents: {'/var/lib/nova/instances/7bee7c13-0682-47c7-9cc6-8206b550bfcc/disk': 'qcow2', '/var/lib/nova/instances/7bee7c13-0682-47c7-9cc6-8206b550bfcc/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854
Jan 21 18:32:33 compute-0 nova_compute[183278]: 2026-01-21 18:32:33.789 183284 DEBUG nova.virt.libvirt.driver [None req-c3dc7a2a-afa2-47d1-8bc8-c71ef9a28fc1 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 7bee7c13-0682-47c7-9cc6-8206b550bfcc] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864
Jan 21 18:32:33 compute-0 nova_compute[183278]: 2026-01-21 18:32:33.789 183284 DEBUG nova.objects.instance [None req-c3dc7a2a-afa2-47d1-8bc8-c71ef9a28fc1 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 7bee7c13-0682-47c7-9cc6-8206b550bfcc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:32:33 compute-0 nova_compute[183278]: 2026-01-21 18:32:33.868 183284 DEBUG oslo_concurrency.processutils [None req-c3dc7a2a-afa2-47d1-8bc8-c71ef9a28fc1 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:32:33 compute-0 nova_compute[183278]: 2026-01-21 18:32:33.929 183284 DEBUG oslo_concurrency.processutils [None req-c3dc7a2a-afa2-47d1-8bc8-c71ef9a28fc1 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:32:33 compute-0 nova_compute[183278]: 2026-01-21 18:32:33.930 183284 DEBUG oslo_concurrency.lockutils [None req-c3dc7a2a-afa2-47d1-8bc8-c71ef9a28fc1 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "8bf45782a806e8f2f684ae874be9ab99d891a685" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:32:33 compute-0 nova_compute[183278]: 2026-01-21 18:32:33.931 183284 DEBUG oslo_concurrency.lockutils [None req-c3dc7a2a-afa2-47d1-8bc8-c71ef9a28fc1 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lock "8bf45782a806e8f2f684ae874be9ab99d891a685" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:32:33 compute-0 nova_compute[183278]: 2026-01-21 18:32:33.943 183284 DEBUG oslo_concurrency.processutils [None req-c3dc7a2a-afa2-47d1-8bc8-c71ef9a28fc1 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:32:33 compute-0 nova_compute[183278]: 2026-01-21 18:32:33.997 183284 DEBUG oslo_concurrency.processutils [None req-c3dc7a2a-afa2-47d1-8bc8-c71ef9a28fc1 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:32:33 compute-0 nova_compute[183278]: 2026-01-21 18:32:33.998 183284 DEBUG oslo_concurrency.processutils [None req-c3dc7a2a-afa2-47d1-8bc8-c71ef9a28fc1 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685,backing_fmt=raw /var/lib/nova/instances/7bee7c13-0682-47c7-9cc6-8206b550bfcc/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:32:34 compute-0 nova_compute[183278]: 2026-01-21 18:32:34.060 183284 DEBUG oslo_concurrency.processutils [None req-c3dc7a2a-afa2-47d1-8bc8-c71ef9a28fc1 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685,backing_fmt=raw /var/lib/nova/instances/7bee7c13-0682-47c7-9cc6-8206b550bfcc/disk 1073741824" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:32:34 compute-0 nova_compute[183278]: 2026-01-21 18:32:34.061 183284 DEBUG oslo_concurrency.lockutils [None req-c3dc7a2a-afa2-47d1-8bc8-c71ef9a28fc1 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lock "8bf45782a806e8f2f684ae874be9ab99d891a685" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.130s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:32:34 compute-0 nova_compute[183278]: 2026-01-21 18:32:34.061 183284 DEBUG oslo_concurrency.processutils [None req-c3dc7a2a-afa2-47d1-8bc8-c71ef9a28fc1 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:32:34 compute-0 nova_compute[183278]: 2026-01-21 18:32:34.116 183284 DEBUG oslo_concurrency.processutils [None req-c3dc7a2a-afa2-47d1-8bc8-c71ef9a28fc1 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:32:34 compute-0 nova_compute[183278]: 2026-01-21 18:32:34.117 183284 DEBUG nova.virt.disk.api [None req-c3dc7a2a-afa2-47d1-8bc8-c71ef9a28fc1 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Checking if we can resize image /var/lib/nova/instances/7bee7c13-0682-47c7-9cc6-8206b550bfcc/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 18:32:34 compute-0 nova_compute[183278]: 2026-01-21 18:32:34.118 183284 DEBUG oslo_concurrency.processutils [None req-c3dc7a2a-afa2-47d1-8bc8-c71ef9a28fc1 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7bee7c13-0682-47c7-9cc6-8206b550bfcc/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:32:34 compute-0 nova_compute[183278]: 2026-01-21 18:32:34.170 183284 DEBUG oslo_concurrency.processutils [None req-c3dc7a2a-afa2-47d1-8bc8-c71ef9a28fc1 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7bee7c13-0682-47c7-9cc6-8206b550bfcc/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:32:34 compute-0 nova_compute[183278]: 2026-01-21 18:32:34.171 183284 DEBUG nova.virt.disk.api [None req-c3dc7a2a-afa2-47d1-8bc8-c71ef9a28fc1 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Cannot resize image /var/lib/nova/instances/7bee7c13-0682-47c7-9cc6-8206b550bfcc/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 18:32:34 compute-0 nova_compute[183278]: 2026-01-21 18:32:34.172 183284 DEBUG nova.objects.instance [None req-c3dc7a2a-afa2-47d1-8bc8-c71ef9a28fc1 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lazy-loading 'migration_context' on Instance uuid 7bee7c13-0682-47c7-9cc6-8206b550bfcc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:32:34 compute-0 nova_compute[183278]: 2026-01-21 18:32:34.222 183284 DEBUG oslo_concurrency.processutils [None req-c3dc7a2a-afa2-47d1-8bc8-c71ef9a28fc1 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/7bee7c13-0682-47c7-9cc6-8206b550bfcc/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:32:34 compute-0 nova_compute[183278]: 2026-01-21 18:32:34.244 183284 DEBUG oslo_concurrency.processutils [None req-c3dc7a2a-afa2-47d1-8bc8-c71ef9a28fc1 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/7bee7c13-0682-47c7-9cc6-8206b550bfcc/disk.config 485376" returned: 0 in 0.022s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:32:34 compute-0 nova_compute[183278]: 2026-01-21 18:32:34.245 183284 DEBUG nova.virt.libvirt.volume.remotefs [None req-c3dc7a2a-afa2-47d1-8bc8-c71ef9a28fc1 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Copying file compute-1.ctlplane.example.com:/var/lib/nova/instances/7bee7c13-0682-47c7-9cc6-8206b550bfcc/disk.config to /var/lib/nova/instances/7bee7c13-0682-47c7-9cc6-8206b550bfcc copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 21 18:32:34 compute-0 nova_compute[183278]: 2026-01-21 18:32:34.245 183284 DEBUG oslo_concurrency.processutils [None req-c3dc7a2a-afa2-47d1-8bc8-c71ef9a28fc1 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Running cmd (subprocess): scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/7bee7c13-0682-47c7-9cc6-8206b550bfcc/disk.config /var/lib/nova/instances/7bee7c13-0682-47c7-9cc6-8206b550bfcc execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:32:34 compute-0 nova_compute[183278]: 2026-01-21 18:32:34.302 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:32:34 compute-0 sshd-session[210214]: Connection closed by 206.168.34.126 port 53074 [preauth]
Jan 21 18:32:34 compute-0 nova_compute[183278]: 2026-01-21 18:32:34.654 183284 DEBUG oslo_concurrency.processutils [None req-c3dc7a2a-afa2-47d1-8bc8-c71ef9a28fc1 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] CMD "scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/7bee7c13-0682-47c7-9cc6-8206b550bfcc/disk.config /var/lib/nova/instances/7bee7c13-0682-47c7-9cc6-8206b550bfcc" returned: 0 in 0.408s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:32:34 compute-0 nova_compute[183278]: 2026-01-21 18:32:34.655 183284 DEBUG nova.virt.libvirt.driver [None req-c3dc7a2a-afa2-47d1-8bc8-c71ef9a28fc1 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 7bee7c13-0682-47c7-9cc6-8206b550bfcc] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Jan 21 18:32:34 compute-0 nova_compute[183278]: 2026-01-21 18:32:34.656 183284 DEBUG nova.virt.libvirt.vif [None req-c3dc7a2a-afa2-47d1-8bc8-c71ef9a28fc1 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T18:31:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1063391097',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1063391097',id=20,image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T18:31:57Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='fe688847145f4dee992c72dd40bbc1ac',ramdisk_id='',reservation_id='r-p01v77kw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1753607426',owner_user_name='tempest-TestExecuteStrategies-1753607426-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T18:31:57Z,user_data=None,user_id='41dc6e790bc54fbfaf5c6007d3fa5f63',uuid=7bee7c13-0682-47c7-9cc6-8206b550bfcc,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c70fe92c-1439-4546-9f8b-4471945eef8f", "address": "fa:16:3e:26:16:76", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapc70fe92c-14", "ovs_interfaceid": "c70fe92c-1439-4546-9f8b-4471945eef8f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 21 18:32:34 compute-0 nova_compute[183278]: 2026-01-21 18:32:34.657 183284 DEBUG nova.network.os_vif_util [None req-c3dc7a2a-afa2-47d1-8bc8-c71ef9a28fc1 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Converting VIF {"id": "c70fe92c-1439-4546-9f8b-4471945eef8f", "address": "fa:16:3e:26:16:76", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapc70fe92c-14", "ovs_interfaceid": "c70fe92c-1439-4546-9f8b-4471945eef8f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 18:32:34 compute-0 nova_compute[183278]: 2026-01-21 18:32:34.658 183284 DEBUG nova.network.os_vif_util [None req-c3dc7a2a-afa2-47d1-8bc8-c71ef9a28fc1 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:26:16:76,bridge_name='br-int',has_traffic_filtering=True,id=c70fe92c-1439-4546-9f8b-4471945eef8f,network=Network(405ec01b-76d3-4c3c-a31b-5f16d9641fbf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc70fe92c-14') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 18:32:34 compute-0 nova_compute[183278]: 2026-01-21 18:32:34.658 183284 DEBUG os_vif [None req-c3dc7a2a-afa2-47d1-8bc8-c71ef9a28fc1 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:26:16:76,bridge_name='br-int',has_traffic_filtering=True,id=c70fe92c-1439-4546-9f8b-4471945eef8f,network=Network(405ec01b-76d3-4c3c-a31b-5f16d9641fbf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc70fe92c-14') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 21 18:32:34 compute-0 nova_compute[183278]: 2026-01-21 18:32:34.659 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:32:34 compute-0 nova_compute[183278]: 2026-01-21 18:32:34.659 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:32:34 compute-0 nova_compute[183278]: 2026-01-21 18:32:34.660 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 18:32:34 compute-0 nova_compute[183278]: 2026-01-21 18:32:34.662 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:32:34 compute-0 nova_compute[183278]: 2026-01-21 18:32:34.662 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc70fe92c-14, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:32:34 compute-0 nova_compute[183278]: 2026-01-21 18:32:34.663 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc70fe92c-14, col_values=(('external_ids', {'iface-id': 'c70fe92c-1439-4546-9f8b-4471945eef8f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:26:16:76', 'vm-uuid': '7bee7c13-0682-47c7-9cc6-8206b550bfcc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:32:34 compute-0 nova_compute[183278]: 2026-01-21 18:32:34.664 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:32:34 compute-0 NetworkManager[55506]: <info>  [1769020354.6658] manager: (tapc70fe92c-14): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/66)
Jan 21 18:32:34 compute-0 nova_compute[183278]: 2026-01-21 18:32:34.667 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 18:32:34 compute-0 nova_compute[183278]: 2026-01-21 18:32:34.671 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:32:34 compute-0 nova_compute[183278]: 2026-01-21 18:32:34.672 183284 INFO os_vif [None req-c3dc7a2a-afa2-47d1-8bc8-c71ef9a28fc1 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:26:16:76,bridge_name='br-int',has_traffic_filtering=True,id=c70fe92c-1439-4546-9f8b-4471945eef8f,network=Network(405ec01b-76d3-4c3c-a31b-5f16d9641fbf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc70fe92c-14')
Jan 21 18:32:34 compute-0 nova_compute[183278]: 2026-01-21 18:32:34.673 183284 DEBUG nova.virt.libvirt.driver [None req-c3dc7a2a-afa2-47d1-8bc8-c71ef9a28fc1 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Jan 21 18:32:34 compute-0 nova_compute[183278]: 2026-01-21 18:32:34.673 183284 DEBUG nova.compute.manager [None req-c3dc7a2a-afa2-47d1-8bc8-c71ef9a28fc1 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp_d3eskxu',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='7bee7c13-0682-47c7-9cc6-8206b550bfcc',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Jan 21 18:32:35 compute-0 nova_compute[183278]: 2026-01-21 18:32:35.904 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:32:36 compute-0 nova_compute[183278]: 2026-01-21 18:32:36.636 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:32:36 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:32:36.637 104698 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=20, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e2:aa:66', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f6:39:87:73:c8'}, ipsec=False) old=SB_Global(nb_cfg=19) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 18:32:36 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:32:36.638 104698 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 21 18:32:37 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:32:37.640 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a4db021d-a451-4e5f-8011-49af760bda68, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '20'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:32:37 compute-0 nova_compute[183278]: 2026-01-21 18:32:37.747 183284 DEBUG nova.network.neutron [None req-c3dc7a2a-afa2-47d1-8bc8-c71ef9a28fc1 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 7bee7c13-0682-47c7-9cc6-8206b550bfcc] Port c70fe92c-1439-4546-9f8b-4471945eef8f updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Jan 21 18:32:37 compute-0 nova_compute[183278]: 2026-01-21 18:32:37.748 183284 DEBUG nova.compute.manager [None req-c3dc7a2a-afa2-47d1-8bc8-c71ef9a28fc1 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp_d3eskxu',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='7bee7c13-0682-47c7-9cc6-8206b550bfcc',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Jan 21 18:32:37 compute-0 systemd[1]: Starting libvirt proxy daemon...
Jan 21 18:32:37 compute-0 systemd[1]: Started libvirt proxy daemon.
Jan 21 18:32:37 compute-0 podman[210270]: 2026-01-21 18:32:37.977442851 +0000 UTC m=+0.057089550 container health_status 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped 
down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, version=9.6, managed_by=edpm_ansible, name=ubi9-minimal, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, architecture=x86_64)
Jan 21 18:32:38 compute-0 NetworkManager[55506]: <info>  [1769020358.0669] manager: (tapc70fe92c-14): new Tun device (/org/freedesktop/NetworkManager/Devices/67)
Jan 21 18:32:38 compute-0 kernel: tapc70fe92c-14: entered promiscuous mode
Jan 21 18:32:38 compute-0 ovn_controller[95419]: 2026-01-21T18:32:38Z|00143|binding|INFO|Claiming lport c70fe92c-1439-4546-9f8b-4471945eef8f for this additional chassis.
Jan 21 18:32:38 compute-0 ovn_controller[95419]: 2026-01-21T18:32:38Z|00144|binding|INFO|c70fe92c-1439-4546-9f8b-4471945eef8f: Claiming fa:16:3e:26:16:76 10.100.0.4
Jan 21 18:32:38 compute-0 nova_compute[183278]: 2026-01-21 18:32:38.069 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:32:38 compute-0 ovn_controller[95419]: 2026-01-21T18:32:38Z|00145|binding|INFO|Setting lport c70fe92c-1439-4546-9f8b-4471945eef8f ovn-installed in OVS
Jan 21 18:32:38 compute-0 nova_compute[183278]: 2026-01-21 18:32:38.083 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:32:38 compute-0 nova_compute[183278]: 2026-01-21 18:32:38.085 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:32:38 compute-0 nova_compute[183278]: 2026-01-21 18:32:38.088 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:32:38 compute-0 systemd-udevd[210324]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 18:32:38 compute-0 systemd-machined[154592]: New machine qemu-14-instance-00000014.
Jan 21 18:32:38 compute-0 NetworkManager[55506]: <info>  [1769020358.1085] device (tapc70fe92c-14): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 18:32:38 compute-0 NetworkManager[55506]: <info>  [1769020358.1090] device (tapc70fe92c-14): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 18:32:38 compute-0 systemd[1]: Started Virtual Machine qemu-14-instance-00000014.
Jan 21 18:32:38 compute-0 nova_compute[183278]: 2026-01-21 18:32:38.578 183284 DEBUG nova.virt.driver [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Emitting event <LifecycleEvent: 1769020358.5784547, 7bee7c13-0682-47c7-9cc6-8206b550bfcc => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:32:38 compute-0 nova_compute[183278]: 2026-01-21 18:32:38.579 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 7bee7c13-0682-47c7-9cc6-8206b550bfcc] VM Started (Lifecycle Event)
Jan 21 18:32:38 compute-0 nova_compute[183278]: 2026-01-21 18:32:38.598 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 7bee7c13-0682-47c7-9cc6-8206b550bfcc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:32:39 compute-0 nova_compute[183278]: 2026-01-21 18:32:39.460 183284 DEBUG nova.virt.driver [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Emitting event <LifecycleEvent: 1769020359.4602134, 7bee7c13-0682-47c7-9cc6-8206b550bfcc => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:32:39 compute-0 nova_compute[183278]: 2026-01-21 18:32:39.461 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 7bee7c13-0682-47c7-9cc6-8206b550bfcc] VM Resumed (Lifecycle Event)
Jan 21 18:32:39 compute-0 nova_compute[183278]: 2026-01-21 18:32:39.482 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 7bee7c13-0682-47c7-9cc6-8206b550bfcc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:32:39 compute-0 nova_compute[183278]: 2026-01-21 18:32:39.485 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 7bee7c13-0682-47c7-9cc6-8206b550bfcc] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 18:32:39 compute-0 nova_compute[183278]: 2026-01-21 18:32:39.505 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 7bee7c13-0682-47c7-9cc6-8206b550bfcc] During the sync_power process the instance has moved from host compute-1.ctlplane.example.com to host compute-0.ctlplane.example.com
Jan 21 18:32:39 compute-0 nova_compute[183278]: 2026-01-21 18:32:39.665 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:32:40 compute-0 nova_compute[183278]: 2026-01-21 18:32:40.905 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:32:40 compute-0 ovn_controller[95419]: 2026-01-21T18:32:40Z|00146|binding|INFO|Claiming lport c70fe92c-1439-4546-9f8b-4471945eef8f for this chassis.
Jan 21 18:32:40 compute-0 ovn_controller[95419]: 2026-01-21T18:32:40Z|00147|binding|INFO|c70fe92c-1439-4546-9f8b-4471945eef8f: Claiming fa:16:3e:26:16:76 10.100.0.4
Jan 21 18:32:40 compute-0 ovn_controller[95419]: 2026-01-21T18:32:40Z|00148|binding|INFO|Setting lport c70fe92c-1439-4546-9f8b-4471945eef8f up in Southbound
Jan 21 18:32:40 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:32:40.981 104698 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:26:16:76 10.100.0.4'], port_security=['fa:16:3e:26:16:76 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '7bee7c13-0682-47c7-9cc6-8206b550bfcc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-405ec01b-76d3-4c3c-a31b-5f16d9641fbf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe688847145f4dee992c72dd40bbc1ac', 'neutron:revision_number': '11', 'neutron:security_group_ids': '772bc98e-4f63-476c-ac15-1e185ee339f7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=64b67886-c0d9-40d2-a2d0-cf96a9cd3c14, chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>], logical_port=c70fe92c-1439-4546-9f8b-4471945eef8f) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 18:32:40 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:32:40.983 104698 INFO neutron.agent.ovn.metadata.agent [-] Port c70fe92c-1439-4546-9f8b-4471945eef8f in datapath 405ec01b-76d3-4c3c-a31b-5f16d9641fbf bound to our chassis
Jan 21 18:32:40 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:32:40.984 104698 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 405ec01b-76d3-4c3c-a31b-5f16d9641fbf
Jan 21 18:32:40 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:32:40.998 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[cefe5a30-eb07-42fe-8d87-0d5c98deb468]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:32:41 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:32:41.028 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[c27c983b-7fbd-4f3e-b21f-973104b3ae2d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:32:41 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:32:41.030 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[c5b52cd9-eca6-47a4-85c5-c1ce203cac51]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:32:41 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:32:41.061 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[ae82e439-3abc-4cd7-b2c8-a8b1eb4d52ca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:32:41 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:32:41.079 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[eeb33491-2d40-4912-866f-764c8fc2f52c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap405ec01b-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6c:95:02'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 486054, 'reachable_time': 26720, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 210361, 'error': None, 'target': 'ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:32:41 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:32:41.096 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[49dc54cb-0e42-486a-98fa-d26f516f2d95]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap405ec01b-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 486064, 'tstamp': 486064}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 210362, 'error': None, 'target': 'ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap405ec01b-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 486068, 'tstamp': 486068}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 210362, 'error': None, 'target': 'ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:32:41 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:32:41.097 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap405ec01b-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:32:41 compute-0 nova_compute[183278]: 2026-01-21 18:32:41.099 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:32:41 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:32:41.100 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap405ec01b-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:32:41 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:32:41.101 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 18:32:41 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:32:41.101 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap405ec01b-70, col_values=(('external_ids', {'iface-id': '9c897ad2-8ce5-4903-8c83-1ed8f117dcdd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:32:41 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:32:41.102 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 18:32:42 compute-0 nova_compute[183278]: 2026-01-21 18:32:42.109 183284 INFO nova.compute.manager [None req-c3dc7a2a-afa2-47d1-8bc8-c71ef9a28fc1 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 7bee7c13-0682-47c7-9cc6-8206b550bfcc] Post operation of migration started
Jan 21 18:32:42 compute-0 nova_compute[183278]: 2026-01-21 18:32:42.662 183284 DEBUG oslo_concurrency.lockutils [None req-c3dc7a2a-afa2-47d1-8bc8-c71ef9a28fc1 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "refresh_cache-7bee7c13-0682-47c7-9cc6-8206b550bfcc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:32:42 compute-0 nova_compute[183278]: 2026-01-21 18:32:42.663 183284 DEBUG oslo_concurrency.lockutils [None req-c3dc7a2a-afa2-47d1-8bc8-c71ef9a28fc1 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Acquired lock "refresh_cache-7bee7c13-0682-47c7-9cc6-8206b550bfcc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 18:32:42 compute-0 nova_compute[183278]: 2026-01-21 18:32:42.663 183284 DEBUG nova.network.neutron [None req-c3dc7a2a-afa2-47d1-8bc8-c71ef9a28fc1 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 7bee7c13-0682-47c7-9cc6-8206b550bfcc] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 18:32:43 compute-0 nova_compute[183278]: 2026-01-21 18:32:43.747 183284 DEBUG nova.network.neutron [None req-c3dc7a2a-afa2-47d1-8bc8-c71ef9a28fc1 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 7bee7c13-0682-47c7-9cc6-8206b550bfcc] Updating instance_info_cache with network_info: [{"id": "c70fe92c-1439-4546-9f8b-4471945eef8f", "address": "fa:16:3e:26:16:76", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc70fe92c-14", "ovs_interfaceid": "c70fe92c-1439-4546-9f8b-4471945eef8f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:32:43 compute-0 nova_compute[183278]: 2026-01-21 18:32:43.776 183284 DEBUG oslo_concurrency.lockutils [None req-c3dc7a2a-afa2-47d1-8bc8-c71ef9a28fc1 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Releasing lock "refresh_cache-7bee7c13-0682-47c7-9cc6-8206b550bfcc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 18:32:43 compute-0 nova_compute[183278]: 2026-01-21 18:32:43.792 183284 DEBUG oslo_concurrency.lockutils [None req-c3dc7a2a-afa2-47d1-8bc8-c71ef9a28fc1 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:32:43 compute-0 nova_compute[183278]: 2026-01-21 18:32:43.793 183284 DEBUG oslo_concurrency.lockutils [None req-c3dc7a2a-afa2-47d1-8bc8-c71ef9a28fc1 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:32:43 compute-0 nova_compute[183278]: 2026-01-21 18:32:43.793 183284 DEBUG oslo_concurrency.lockutils [None req-c3dc7a2a-afa2-47d1-8bc8-c71ef9a28fc1 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:32:43 compute-0 nova_compute[183278]: 2026-01-21 18:32:43.797 183284 INFO nova.virt.libvirt.driver [None req-c3dc7a2a-afa2-47d1-8bc8-c71ef9a28fc1 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 7bee7c13-0682-47c7-9cc6-8206b550bfcc] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Jan 21 18:32:43 compute-0 virtqemud[182681]: Domain id=14 name='instance-00000014' uuid=7bee7c13-0682-47c7-9cc6-8206b550bfcc is tainted: custom-monitor
Jan 21 18:32:44 compute-0 podman[210364]: 2026-01-21 18:32:44.01359218 +0000 UTC m=+0.057453430 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 18:32:44 compute-0 podman[210363]: 2026-01-21 18:32:44.044345234 +0000 UTC m=+0.086555003 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, 
io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 21 18:32:44 compute-0 nova_compute[183278]: 2026-01-21 18:32:44.669 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:32:44 compute-0 nova_compute[183278]: 2026-01-21 18:32:44.804 183284 INFO nova.virt.libvirt.driver [None req-c3dc7a2a-afa2-47d1-8bc8-c71ef9a28fc1 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 7bee7c13-0682-47c7-9cc6-8206b550bfcc] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Jan 21 18:32:45 compute-0 nova_compute[183278]: 2026-01-21 18:32:45.809 183284 INFO nova.virt.libvirt.driver [None req-c3dc7a2a-afa2-47d1-8bc8-c71ef9a28fc1 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 7bee7c13-0682-47c7-9cc6-8206b550bfcc] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Jan 21 18:32:45 compute-0 nova_compute[183278]: 2026-01-21 18:32:45.814 183284 DEBUG nova.compute.manager [None req-c3dc7a2a-afa2-47d1-8bc8-c71ef9a28fc1 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 7bee7c13-0682-47c7-9cc6-8206b550bfcc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:32:45 compute-0 nova_compute[183278]: 2026-01-21 18:32:45.910 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:32:46 compute-0 nova_compute[183278]: 2026-01-21 18:32:46.161 183284 DEBUG nova.objects.instance [None req-c3dc7a2a-afa2-47d1-8bc8-c71ef9a28fc1 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 7bee7c13-0682-47c7-9cc6-8206b550bfcc] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Jan 21 18:32:48 compute-0 podman[210405]: 2026-01-21 18:32:48.997432629 +0000 UTC m=+0.052361207 container health_status 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 18:32:49 compute-0 nova_compute[183278]: 2026-01-21 18:32:49.671 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:32:50 compute-0 nova_compute[183278]: 2026-01-21 18:32:50.915 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:32:51 compute-0 nova_compute[183278]: 2026-01-21 18:32:51.493 183284 DEBUG oslo_concurrency.lockutils [None req-9f1d62ad-6bd3-4fad-ac41-ecdc2b221bb8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Acquiring lock "7bee7c13-0682-47c7-9cc6-8206b550bfcc" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:32:51 compute-0 nova_compute[183278]: 2026-01-21 18:32:51.493 183284 DEBUG oslo_concurrency.lockutils [None req-9f1d62ad-6bd3-4fad-ac41-ecdc2b221bb8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "7bee7c13-0682-47c7-9cc6-8206b550bfcc" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:32:51 compute-0 nova_compute[183278]: 2026-01-21 18:32:51.494 183284 DEBUG oslo_concurrency.lockutils [None req-9f1d62ad-6bd3-4fad-ac41-ecdc2b221bb8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Acquiring lock "7bee7c13-0682-47c7-9cc6-8206b550bfcc-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:32:51 compute-0 nova_compute[183278]: 2026-01-21 18:32:51.494 183284 DEBUG oslo_concurrency.lockutils [None req-9f1d62ad-6bd3-4fad-ac41-ecdc2b221bb8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "7bee7c13-0682-47c7-9cc6-8206b550bfcc-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:32:51 compute-0 nova_compute[183278]: 2026-01-21 18:32:51.495 183284 DEBUG oslo_concurrency.lockutils [None req-9f1d62ad-6bd3-4fad-ac41-ecdc2b221bb8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "7bee7c13-0682-47c7-9cc6-8206b550bfcc-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:32:51 compute-0 nova_compute[183278]: 2026-01-21 18:32:51.496 183284 INFO nova.compute.manager [None req-9f1d62ad-6bd3-4fad-ac41-ecdc2b221bb8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 7bee7c13-0682-47c7-9cc6-8206b550bfcc] Terminating instance
Jan 21 18:32:51 compute-0 nova_compute[183278]: 2026-01-21 18:32:51.497 183284 DEBUG nova.compute.manager [None req-9f1d62ad-6bd3-4fad-ac41-ecdc2b221bb8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 7bee7c13-0682-47c7-9cc6-8206b550bfcc] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 21 18:32:51 compute-0 kernel: tapc70fe92c-14 (unregistering): left promiscuous mode
Jan 21 18:32:51 compute-0 NetworkManager[55506]: <info>  [1769020371.5261] device (tapc70fe92c-14): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 18:32:51 compute-0 ovn_controller[95419]: 2026-01-21T18:32:51Z|00149|binding|INFO|Releasing lport c70fe92c-1439-4546-9f8b-4471945eef8f from this chassis (sb_readonly=0)
Jan 21 18:32:51 compute-0 nova_compute[183278]: 2026-01-21 18:32:51.535 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:32:51 compute-0 ovn_controller[95419]: 2026-01-21T18:32:51Z|00150|binding|INFO|Setting lport c70fe92c-1439-4546-9f8b-4471945eef8f down in Southbound
Jan 21 18:32:51 compute-0 ovn_controller[95419]: 2026-01-21T18:32:51Z|00151|binding|INFO|Removing iface tapc70fe92c-14 ovn-installed in OVS
Jan 21 18:32:51 compute-0 nova_compute[183278]: 2026-01-21 18:32:51.537 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:32:51 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:32:51.543 104698 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:26:16:76 10.100.0.4'], port_security=['fa:16:3e:26:16:76 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '7bee7c13-0682-47c7-9cc6-8206b550bfcc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-405ec01b-76d3-4c3c-a31b-5f16d9641fbf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe688847145f4dee992c72dd40bbc1ac', 'neutron:revision_number': '13', 'neutron:security_group_ids': '772bc98e-4f63-476c-ac15-1e185ee339f7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=64b67886-c0d9-40d2-a2d0-cf96a9cd3c14, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>], logical_port=c70fe92c-1439-4546-9f8b-4471945eef8f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 18:32:51 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:32:51.544 104698 INFO neutron.agent.ovn.metadata.agent [-] Port c70fe92c-1439-4546-9f8b-4471945eef8f in datapath 405ec01b-76d3-4c3c-a31b-5f16d9641fbf unbound from our chassis
Jan 21 18:32:51 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:32:51.545 104698 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 405ec01b-76d3-4c3c-a31b-5f16d9641fbf
Jan 21 18:32:51 compute-0 nova_compute[183278]: 2026-01-21 18:32:51.548 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:32:51 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:32:51.561 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[2ced7f07-1449-4ff1-ba3a-4d13cfa82285]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:32:51 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:32:51.591 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[cdb0ffb0-a896-40e6-8dee-3646af2c4e82]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:32:51 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:32:51.594 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[09c0a202-c401-4b3c-8db5-20c57fc2773b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:32:51 compute-0 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000014.scope: Deactivated successfully.
Jan 21 18:32:51 compute-0 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000014.scope: Consumed 1.467s CPU time.
Jan 21 18:32:51 compute-0 systemd-machined[154592]: Machine qemu-14-instance-00000014 terminated.
Jan 21 18:32:51 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:32:51.624 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[270e0af7-066b-46df-85c0-0c1f3579306c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:32:51 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:32:51.641 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[4a231521-7b95-4a3e-baf7-9943b777c7da]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap405ec01b-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6c:95:02'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 486054, 'reachable_time': 26720, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 210441, 'error': None, 'target': 'ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:32:51 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:32:51.658 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[26d6d52d-5e97-46ae-af68-fcb4dd9efc03]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap405ec01b-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 486064, 'tstamp': 486064}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 210442, 'error': None, 'target': 'ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap405ec01b-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 486068, 'tstamp': 486068}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 210442, 'error': None, 'target': 'ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:32:51 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:32:51.660 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap405ec01b-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:32:51 compute-0 nova_compute[183278]: 2026-01-21 18:32:51.662 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:32:51 compute-0 nova_compute[183278]: 2026-01-21 18:32:51.666 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:32:51 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:32:51.667 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap405ec01b-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:32:51 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:32:51.668 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 18:32:51 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:32:51.668 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap405ec01b-70, col_values=(('external_ids', {'iface-id': '9c897ad2-8ce5-4903-8c83-1ed8f117dcdd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:32:51 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:32:51.668 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 18:32:51 compute-0 nova_compute[183278]: 2026-01-21 18:32:51.717 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:32:51 compute-0 nova_compute[183278]: 2026-01-21 18:32:51.722 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:32:51 compute-0 nova_compute[183278]: 2026-01-21 18:32:51.759 183284 INFO nova.virt.libvirt.driver [-] [instance: 7bee7c13-0682-47c7-9cc6-8206b550bfcc] Instance destroyed successfully.
Jan 21 18:32:51 compute-0 nova_compute[183278]: 2026-01-21 18:32:51.760 183284 DEBUG nova.objects.instance [None req-9f1d62ad-6bd3-4fad-ac41-ecdc2b221bb8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lazy-loading 'resources' on Instance uuid 7bee7c13-0682-47c7-9cc6-8206b550bfcc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:32:51 compute-0 nova_compute[183278]: 2026-01-21 18:32:51.775 183284 DEBUG nova.virt.libvirt.vif [None req-9f1d62ad-6bd3-4fad-ac41-ecdc2b221bb8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-21T18:31:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1063391097',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1063391097',id=20,image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T18:31:57Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='fe688847145f4dee992c72dd40bbc1ac',ramdisk_id='',reservation_id='r-p01v77kw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0'
,owner_project_name='tempest-TestExecuteStrategies-1753607426',owner_user_name='tempest-TestExecuteStrategies-1753607426-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T18:32:46Z,user_data=None,user_id='41dc6e790bc54fbfaf5c6007d3fa5f63',uuid=7bee7c13-0682-47c7-9cc6-8206b550bfcc,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c70fe92c-1439-4546-9f8b-4471945eef8f", "address": "fa:16:3e:26:16:76", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc70fe92c-14", "ovs_interfaceid": "c70fe92c-1439-4546-9f8b-4471945eef8f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 18:32:51 compute-0 nova_compute[183278]: 2026-01-21 18:32:51.775 183284 DEBUG nova.network.os_vif_util [None req-9f1d62ad-6bd3-4fad-ac41-ecdc2b221bb8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Converting VIF {"id": "c70fe92c-1439-4546-9f8b-4471945eef8f", "address": "fa:16:3e:26:16:76", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc70fe92c-14", "ovs_interfaceid": "c70fe92c-1439-4546-9f8b-4471945eef8f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 18:32:51 compute-0 nova_compute[183278]: 2026-01-21 18:32:51.776 183284 DEBUG nova.network.os_vif_util [None req-9f1d62ad-6bd3-4fad-ac41-ecdc2b221bb8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:26:16:76,bridge_name='br-int',has_traffic_filtering=True,id=c70fe92c-1439-4546-9f8b-4471945eef8f,network=Network(405ec01b-76d3-4c3c-a31b-5f16d9641fbf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc70fe92c-14') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 18:32:51 compute-0 nova_compute[183278]: 2026-01-21 18:32:51.777 183284 DEBUG os_vif [None req-9f1d62ad-6bd3-4fad-ac41-ecdc2b221bb8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:26:16:76,bridge_name='br-int',has_traffic_filtering=True,id=c70fe92c-1439-4546-9f8b-4471945eef8f,network=Network(405ec01b-76d3-4c3c-a31b-5f16d9641fbf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc70fe92c-14') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 21 18:32:51 compute-0 nova_compute[183278]: 2026-01-21 18:32:51.779 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:32:51 compute-0 nova_compute[183278]: 2026-01-21 18:32:51.779 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc70fe92c-14, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:32:51 compute-0 nova_compute[183278]: 2026-01-21 18:32:51.781 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:32:51 compute-0 nova_compute[183278]: 2026-01-21 18:32:51.781 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:32:51 compute-0 nova_compute[183278]: 2026-01-21 18:32:51.784 183284 INFO os_vif [None req-9f1d62ad-6bd3-4fad-ac41-ecdc2b221bb8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:26:16:76,bridge_name='br-int',has_traffic_filtering=True,id=c70fe92c-1439-4546-9f8b-4471945eef8f,network=Network(405ec01b-76d3-4c3c-a31b-5f16d9641fbf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc70fe92c-14')
Jan 21 18:32:51 compute-0 nova_compute[183278]: 2026-01-21 18:32:51.785 183284 INFO nova.virt.libvirt.driver [None req-9f1d62ad-6bd3-4fad-ac41-ecdc2b221bb8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 7bee7c13-0682-47c7-9cc6-8206b550bfcc] Deleting instance files /var/lib/nova/instances/7bee7c13-0682-47c7-9cc6-8206b550bfcc_del
Jan 21 18:32:51 compute-0 nova_compute[183278]: 2026-01-21 18:32:51.786 183284 INFO nova.virt.libvirt.driver [None req-9f1d62ad-6bd3-4fad-ac41-ecdc2b221bb8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 7bee7c13-0682-47c7-9cc6-8206b550bfcc] Deletion of /var/lib/nova/instances/7bee7c13-0682-47c7-9cc6-8206b550bfcc_del complete
Jan 21 18:32:51 compute-0 nova_compute[183278]: 2026-01-21 18:32:51.803 183284 DEBUG nova.compute.manager [req-8d30d61b-abe5-4006-b787-33fd7b3a198d req-5534e43c-37bf-4fa8-860a-cfddb5527931 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 7bee7c13-0682-47c7-9cc6-8206b550bfcc] Received event network-vif-unplugged-c70fe92c-1439-4546-9f8b-4471945eef8f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:32:51 compute-0 nova_compute[183278]: 2026-01-21 18:32:51.803 183284 DEBUG oslo_concurrency.lockutils [req-8d30d61b-abe5-4006-b787-33fd7b3a198d req-5534e43c-37bf-4fa8-860a-cfddb5527931 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "7bee7c13-0682-47c7-9cc6-8206b550bfcc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:32:51 compute-0 nova_compute[183278]: 2026-01-21 18:32:51.803 183284 DEBUG oslo_concurrency.lockutils [req-8d30d61b-abe5-4006-b787-33fd7b3a198d req-5534e43c-37bf-4fa8-860a-cfddb5527931 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "7bee7c13-0682-47c7-9cc6-8206b550bfcc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:32:51 compute-0 nova_compute[183278]: 2026-01-21 18:32:51.804 183284 DEBUG oslo_concurrency.lockutils [req-8d30d61b-abe5-4006-b787-33fd7b3a198d req-5534e43c-37bf-4fa8-860a-cfddb5527931 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "7bee7c13-0682-47c7-9cc6-8206b550bfcc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:32:51 compute-0 nova_compute[183278]: 2026-01-21 18:32:51.804 183284 DEBUG nova.compute.manager [req-8d30d61b-abe5-4006-b787-33fd7b3a198d req-5534e43c-37bf-4fa8-860a-cfddb5527931 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 7bee7c13-0682-47c7-9cc6-8206b550bfcc] No waiting events found dispatching network-vif-unplugged-c70fe92c-1439-4546-9f8b-4471945eef8f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:32:51 compute-0 nova_compute[183278]: 2026-01-21 18:32:51.804 183284 DEBUG nova.compute.manager [req-8d30d61b-abe5-4006-b787-33fd7b3a198d req-5534e43c-37bf-4fa8-860a-cfddb5527931 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 7bee7c13-0682-47c7-9cc6-8206b550bfcc] Received event network-vif-unplugged-c70fe92c-1439-4546-9f8b-4471945eef8f for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 21 18:32:52 compute-0 nova_compute[183278]: 2026-01-21 18:32:52.084 183284 INFO nova.compute.manager [None req-9f1d62ad-6bd3-4fad-ac41-ecdc2b221bb8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 7bee7c13-0682-47c7-9cc6-8206b550bfcc] Took 0.59 seconds to destroy the instance on the hypervisor.
Jan 21 18:32:52 compute-0 nova_compute[183278]: 2026-01-21 18:32:52.085 183284 DEBUG oslo.service.loopingcall [None req-9f1d62ad-6bd3-4fad-ac41-ecdc2b221bb8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 21 18:32:52 compute-0 nova_compute[183278]: 2026-01-21 18:32:52.085 183284 DEBUG nova.compute.manager [-] [instance: 7bee7c13-0682-47c7-9cc6-8206b550bfcc] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 21 18:32:52 compute-0 nova_compute[183278]: 2026-01-21 18:32:52.085 183284 DEBUG nova.network.neutron [-] [instance: 7bee7c13-0682-47c7-9cc6-8206b550bfcc] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 21 18:32:52 compute-0 nova_compute[183278]: 2026-01-21 18:32:52.645 183284 DEBUG nova.network.neutron [-] [instance: 7bee7c13-0682-47c7-9cc6-8206b550bfcc] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:32:52 compute-0 nova_compute[183278]: 2026-01-21 18:32:52.661 183284 INFO nova.compute.manager [-] [instance: 7bee7c13-0682-47c7-9cc6-8206b550bfcc] Took 0.58 seconds to deallocate network for instance.
Jan 21 18:32:52 compute-0 nova_compute[183278]: 2026-01-21 18:32:52.706 183284 DEBUG oslo_concurrency.lockutils [None req-9f1d62ad-6bd3-4fad-ac41-ecdc2b221bb8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:32:52 compute-0 nova_compute[183278]: 2026-01-21 18:32:52.707 183284 DEBUG oslo_concurrency.lockutils [None req-9f1d62ad-6bd3-4fad-ac41-ecdc2b221bb8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:32:52 compute-0 nova_compute[183278]: 2026-01-21 18:32:52.713 183284 DEBUG oslo_concurrency.lockutils [None req-9f1d62ad-6bd3-4fad-ac41-ecdc2b221bb8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.006s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:32:52 compute-0 nova_compute[183278]: 2026-01-21 18:32:52.762 183284 INFO nova.scheduler.client.report [None req-9f1d62ad-6bd3-4fad-ac41-ecdc2b221bb8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Deleted allocations for instance 7bee7c13-0682-47c7-9cc6-8206b550bfcc
Jan 21 18:32:52 compute-0 nova_compute[183278]: 2026-01-21 18:32:52.827 183284 DEBUG oslo_concurrency.lockutils [None req-9f1d62ad-6bd3-4fad-ac41-ecdc2b221bb8 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "7bee7c13-0682-47c7-9cc6-8206b550bfcc" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.334s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:32:53 compute-0 nova_compute[183278]: 2026-01-21 18:32:53.471 183284 DEBUG oslo_concurrency.lockutils [None req-96380efd-a69b-4a12-9231-988a76eec70d 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Acquiring lock "c0c2fb2d-4e32-4586-8725-c987369255b6" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:32:53 compute-0 nova_compute[183278]: 2026-01-21 18:32:53.472 183284 DEBUG oslo_concurrency.lockutils [None req-96380efd-a69b-4a12-9231-988a76eec70d 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "c0c2fb2d-4e32-4586-8725-c987369255b6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:32:53 compute-0 nova_compute[183278]: 2026-01-21 18:32:53.472 183284 DEBUG oslo_concurrency.lockutils [None req-96380efd-a69b-4a12-9231-988a76eec70d 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Acquiring lock "c0c2fb2d-4e32-4586-8725-c987369255b6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:32:53 compute-0 nova_compute[183278]: 2026-01-21 18:32:53.472 183284 DEBUG oslo_concurrency.lockutils [None req-96380efd-a69b-4a12-9231-988a76eec70d 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "c0c2fb2d-4e32-4586-8725-c987369255b6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:32:53 compute-0 nova_compute[183278]: 2026-01-21 18:32:53.473 183284 DEBUG oslo_concurrency.lockutils [None req-96380efd-a69b-4a12-9231-988a76eec70d 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "c0c2fb2d-4e32-4586-8725-c987369255b6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:32:53 compute-0 nova_compute[183278]: 2026-01-21 18:32:53.474 183284 INFO nova.compute.manager [None req-96380efd-a69b-4a12-9231-988a76eec70d 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: c0c2fb2d-4e32-4586-8725-c987369255b6] Terminating instance
Jan 21 18:32:53 compute-0 nova_compute[183278]: 2026-01-21 18:32:53.474 183284 DEBUG nova.compute.manager [None req-96380efd-a69b-4a12-9231-988a76eec70d 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: c0c2fb2d-4e32-4586-8725-c987369255b6] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 21 18:32:53 compute-0 kernel: tap23b1c1c6-59 (unregistering): left promiscuous mode
Jan 21 18:32:53 compute-0 NetworkManager[55506]: <info>  [1769020373.5048] device (tap23b1c1c6-59): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 18:32:53 compute-0 nova_compute[183278]: 2026-01-21 18:32:53.514 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:32:53 compute-0 ovn_controller[95419]: 2026-01-21T18:32:53Z|00152|binding|INFO|Releasing lport 23b1c1c6-59de-47af-98b4-6d28c3f94ff3 from this chassis (sb_readonly=0)
Jan 21 18:32:53 compute-0 ovn_controller[95419]: 2026-01-21T18:32:53Z|00153|binding|INFO|Setting lport 23b1c1c6-59de-47af-98b4-6d28c3f94ff3 down in Southbound
Jan 21 18:32:53 compute-0 ovn_controller[95419]: 2026-01-21T18:32:53Z|00154|binding|INFO|Removing iface tap23b1c1c6-59 ovn-installed in OVS
Jan 21 18:32:53 compute-0 nova_compute[183278]: 2026-01-21 18:32:53.517 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:32:53 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:32:53.528 104698 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1e:2a:51 10.100.0.8'], port_security=['fa:16:3e:1e:2a:51 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'c0c2fb2d-4e32-4586-8725-c987369255b6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-405ec01b-76d3-4c3c-a31b-5f16d9641fbf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe688847145f4dee992c72dd40bbc1ac', 'neutron:revision_number': '4', 'neutron:security_group_ids': '772bc98e-4f63-476c-ac15-1e185ee339f7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=64b67886-c0d9-40d2-a2d0-cf96a9cd3c14, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>], logical_port=23b1c1c6-59de-47af-98b4-6d28c3f94ff3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 18:32:53 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:32:53.529 104698 INFO neutron.agent.ovn.metadata.agent [-] Port 23b1c1c6-59de-47af-98b4-6d28c3f94ff3 in datapath 405ec01b-76d3-4c3c-a31b-5f16d9641fbf unbound from our chassis
Jan 21 18:32:53 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:32:53.530 104698 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 405ec01b-76d3-4c3c-a31b-5f16d9641fbf, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 21 18:32:53 compute-0 nova_compute[183278]: 2026-01-21 18:32:53.531 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:32:53 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:32:53.530 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[4763ab5e-6ffe-4858-a669-ab735562c51b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:32:53 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:32:53.531 104698 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf namespace which is not needed anymore
Jan 21 18:32:53 compute-0 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000013.scope: Deactivated successfully.
Jan 21 18:32:53 compute-0 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000013.scope: Consumed 15.868s CPU time.
Jan 21 18:32:53 compute-0 systemd-machined[154592]: Machine qemu-13-instance-00000013 terminated.
Jan 21 18:32:53 compute-0 neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf[210043]: [NOTICE]   (210047) : haproxy version is 2.8.14-c23fe91
Jan 21 18:32:53 compute-0 neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf[210043]: [NOTICE]   (210047) : path to executable is /usr/sbin/haproxy
Jan 21 18:32:53 compute-0 neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf[210043]: [WARNING]  (210047) : Exiting Master process...
Jan 21 18:32:53 compute-0 neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf[210043]: [ALERT]    (210047) : Current worker (210049) exited with code 143 (Terminated)
Jan 21 18:32:53 compute-0 neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf[210043]: [WARNING]  (210047) : All workers exited. Exiting... (0)
Jan 21 18:32:53 compute-0 systemd[1]: libpod-ec7b359feada0b8a30bfecb70489fa2e682125fd69f4fe379717f9c3b8aa4829.scope: Deactivated successfully.
Jan 21 18:32:53 compute-0 podman[210482]: 2026-01-21 18:32:53.663603069 +0000 UTC m=+0.042979671 container died ec7b359feada0b8a30bfecb70489fa2e682125fd69f4fe379717f9c3b8aa4829 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 21 18:32:53 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ec7b359feada0b8a30bfecb70489fa2e682125fd69f4fe379717f9c3b8aa4829-userdata-shm.mount: Deactivated successfully.
Jan 21 18:32:53 compute-0 NetworkManager[55506]: <info>  [1769020373.6908] manager: (tap23b1c1c6-59): new Tun device (/org/freedesktop/NetworkManager/Devices/68)
Jan 21 18:32:53 compute-0 systemd[1]: var-lib-containers-storage-overlay-2f54d3ca2ab1a4670fd6c48c83e2af08ea3bdb66afec45da15317ad843a39e48-merged.mount: Deactivated successfully.
Jan 21 18:32:53 compute-0 podman[210482]: 2026-01-21 18:32:53.700851079 +0000 UTC m=+0.080227681 container cleanup ec7b359feada0b8a30bfecb70489fa2e682125fd69f4fe379717f9c3b8aa4829 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 21 18:32:53 compute-0 systemd[1]: libpod-conmon-ec7b359feada0b8a30bfecb70489fa2e682125fd69f4fe379717f9c3b8aa4829.scope: Deactivated successfully.
Jan 21 18:32:53 compute-0 nova_compute[183278]: 2026-01-21 18:32:53.730 183284 INFO nova.virt.libvirt.driver [-] [instance: c0c2fb2d-4e32-4586-8725-c987369255b6] Instance destroyed successfully.
Jan 21 18:32:53 compute-0 nova_compute[183278]: 2026-01-21 18:32:53.732 183284 DEBUG nova.objects.instance [None req-96380efd-a69b-4a12-9231-988a76eec70d 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lazy-loading 'resources' on Instance uuid c0c2fb2d-4e32-4586-8725-c987369255b6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:32:53 compute-0 nova_compute[183278]: 2026-01-21 18:32:53.746 183284 DEBUG nova.virt.libvirt.vif [None req-96380efd-a69b-4a12-9231-988a76eec70d 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T18:31:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-746758313',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-746758313',id=19,image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T18:31:37Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='fe688847145f4dee992c72dd40bbc1ac',ramdisk_id='',reservation_id='r-9qo34i17',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name=
'tempest-TestExecuteStrategies-1753607426',owner_user_name='tempest-TestExecuteStrategies-1753607426-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T18:31:37Z,user_data=None,user_id='41dc6e790bc54fbfaf5c6007d3fa5f63',uuid=c0c2fb2d-4e32-4586-8725-c987369255b6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "23b1c1c6-59de-47af-98b4-6d28c3f94ff3", "address": "fa:16:3e:1e:2a:51", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23b1c1c6-59", "ovs_interfaceid": "23b1c1c6-59de-47af-98b4-6d28c3f94ff3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 18:32:53 compute-0 nova_compute[183278]: 2026-01-21 18:32:53.746 183284 DEBUG nova.network.os_vif_util [None req-96380efd-a69b-4a12-9231-988a76eec70d 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Converting VIF {"id": "23b1c1c6-59de-47af-98b4-6d28c3f94ff3", "address": "fa:16:3e:1e:2a:51", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23b1c1c6-59", "ovs_interfaceid": "23b1c1c6-59de-47af-98b4-6d28c3f94ff3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 18:32:53 compute-0 nova_compute[183278]: 2026-01-21 18:32:53.747 183284 DEBUG nova.network.os_vif_util [None req-96380efd-a69b-4a12-9231-988a76eec70d 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:1e:2a:51,bridge_name='br-int',has_traffic_filtering=True,id=23b1c1c6-59de-47af-98b4-6d28c3f94ff3,network=Network(405ec01b-76d3-4c3c-a31b-5f16d9641fbf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap23b1c1c6-59') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 18:32:53 compute-0 nova_compute[183278]: 2026-01-21 18:32:53.747 183284 DEBUG os_vif [None req-96380efd-a69b-4a12-9231-988a76eec70d 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:1e:2a:51,bridge_name='br-int',has_traffic_filtering=True,id=23b1c1c6-59de-47af-98b4-6d28c3f94ff3,network=Network(405ec01b-76d3-4c3c-a31b-5f16d9641fbf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap23b1c1c6-59') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 21 18:32:53 compute-0 nova_compute[183278]: 2026-01-21 18:32:53.748 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:32:53 compute-0 nova_compute[183278]: 2026-01-21 18:32:53.749 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap23b1c1c6-59, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:32:53 compute-0 nova_compute[183278]: 2026-01-21 18:32:53.750 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:32:53 compute-0 nova_compute[183278]: 2026-01-21 18:32:53.752 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 18:32:53 compute-0 nova_compute[183278]: 2026-01-21 18:32:53.754 183284 INFO os_vif [None req-96380efd-a69b-4a12-9231-988a76eec70d 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:1e:2a:51,bridge_name='br-int',has_traffic_filtering=True,id=23b1c1c6-59de-47af-98b4-6d28c3f94ff3,network=Network(405ec01b-76d3-4c3c-a31b-5f16d9641fbf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap23b1c1c6-59')
Jan 21 18:32:53 compute-0 nova_compute[183278]: 2026-01-21 18:32:53.755 183284 INFO nova.virt.libvirt.driver [None req-96380efd-a69b-4a12-9231-988a76eec70d 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: c0c2fb2d-4e32-4586-8725-c987369255b6] Deleting instance files /var/lib/nova/instances/c0c2fb2d-4e32-4586-8725-c987369255b6_del
Jan 21 18:32:53 compute-0 nova_compute[183278]: 2026-01-21 18:32:53.755 183284 INFO nova.virt.libvirt.driver [None req-96380efd-a69b-4a12-9231-988a76eec70d 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: c0c2fb2d-4e32-4586-8725-c987369255b6] Deletion of /var/lib/nova/instances/c0c2fb2d-4e32-4586-8725-c987369255b6_del complete
Jan 21 18:32:53 compute-0 podman[210523]: 2026-01-21 18:32:53.771575579 +0000 UTC m=+0.046183748 container remove ec7b359feada0b8a30bfecb70489fa2e682125fd69f4fe379717f9c3b8aa4829 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 21 18:32:53 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:32:53.775 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[9eaff927-9edb-49c9-8514-d442d55bd5f0]: (4, ('Wed Jan 21 06:32:53 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf (ec7b359feada0b8a30bfecb70489fa2e682125fd69f4fe379717f9c3b8aa4829)\nec7b359feada0b8a30bfecb70489fa2e682125fd69f4fe379717f9c3b8aa4829\nWed Jan 21 06:32:53 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf (ec7b359feada0b8a30bfecb70489fa2e682125fd69f4fe379717f9c3b8aa4829)\nec7b359feada0b8a30bfecb70489fa2e682125fd69f4fe379717f9c3b8aa4829\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:32:53 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:32:53.777 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[0fce13da-6109-4f3d-9902-2774c407e184]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:32:53 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:32:53.778 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap405ec01b-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:32:53 compute-0 nova_compute[183278]: 2026-01-21 18:32:53.779 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:32:53 compute-0 kernel: tap405ec01b-70: left promiscuous mode
Jan 21 18:32:53 compute-0 nova_compute[183278]: 2026-01-21 18:32:53.790 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:32:53 compute-0 nova_compute[183278]: 2026-01-21 18:32:53.791 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:32:53 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:32:53.792 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[b2bb8941-d84a-41ed-b414-a16882ce303a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:32:53 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:32:53.805 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[099104af-38f1-4e20-8d37-e537e84fd3c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:32:53 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:32:53.806 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[b466b58e-a6c2-47f5-b2ae-cf9725abfbcc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:32:53 compute-0 nova_compute[183278]: 2026-01-21 18:32:53.812 183284 INFO nova.compute.manager [None req-96380efd-a69b-4a12-9231-988a76eec70d 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: c0c2fb2d-4e32-4586-8725-c987369255b6] Took 0.34 seconds to destroy the instance on the hypervisor.
Jan 21 18:32:53 compute-0 nova_compute[183278]: 2026-01-21 18:32:53.813 183284 DEBUG oslo.service.loopingcall [None req-96380efd-a69b-4a12-9231-988a76eec70d 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 21 18:32:53 compute-0 nova_compute[183278]: 2026-01-21 18:32:53.813 183284 DEBUG nova.compute.manager [-] [instance: c0c2fb2d-4e32-4586-8725-c987369255b6] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 21 18:32:53 compute-0 nova_compute[183278]: 2026-01-21 18:32:53.813 183284 DEBUG nova.network.neutron [-] [instance: c0c2fb2d-4e32-4586-8725-c987369255b6] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 21 18:32:53 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:32:53.820 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[45562e5b-2cca-44cc-9d40-26fe93f1bb34]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 486047, 'reachable_time': 16495, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 210541, 'error': None, 'target': 'ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:32:53 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:32:53.823 105036 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 21 18:32:53 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:32:53.823 105036 DEBUG oslo.privsep.daemon [-] privsep: reply[56fffa84-2875-4ab6-bccf-c33220aca2ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:32:53 compute-0 systemd[1]: run-netns-ovnmeta\x2d405ec01b\x2d76d3\x2d4c3c\x2da31b\x2d5f16d9641fbf.mount: Deactivated successfully.
Jan 21 18:32:53 compute-0 nova_compute[183278]: 2026-01-21 18:32:53.901 183284 DEBUG nova.compute.manager [req-3eae283b-1f64-494f-9920-adb3347b538a req-484d619e-b8ef-4ebb-adf4-e31619a69340 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 7bee7c13-0682-47c7-9cc6-8206b550bfcc] Received event network-vif-plugged-c70fe92c-1439-4546-9f8b-4471945eef8f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:32:53 compute-0 nova_compute[183278]: 2026-01-21 18:32:53.902 183284 DEBUG oslo_concurrency.lockutils [req-3eae283b-1f64-494f-9920-adb3347b538a req-484d619e-b8ef-4ebb-adf4-e31619a69340 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "7bee7c13-0682-47c7-9cc6-8206b550bfcc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:32:53 compute-0 nova_compute[183278]: 2026-01-21 18:32:53.902 183284 DEBUG oslo_concurrency.lockutils [req-3eae283b-1f64-494f-9920-adb3347b538a req-484d619e-b8ef-4ebb-adf4-e31619a69340 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "7bee7c13-0682-47c7-9cc6-8206b550bfcc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:32:53 compute-0 nova_compute[183278]: 2026-01-21 18:32:53.903 183284 DEBUG oslo_concurrency.lockutils [req-3eae283b-1f64-494f-9920-adb3347b538a req-484d619e-b8ef-4ebb-adf4-e31619a69340 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "7bee7c13-0682-47c7-9cc6-8206b550bfcc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:32:53 compute-0 nova_compute[183278]: 2026-01-21 18:32:53.903 183284 DEBUG nova.compute.manager [req-3eae283b-1f64-494f-9920-adb3347b538a req-484d619e-b8ef-4ebb-adf4-e31619a69340 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 7bee7c13-0682-47c7-9cc6-8206b550bfcc] No waiting events found dispatching network-vif-plugged-c70fe92c-1439-4546-9f8b-4471945eef8f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:32:53 compute-0 nova_compute[183278]: 2026-01-21 18:32:53.903 183284 WARNING nova.compute.manager [req-3eae283b-1f64-494f-9920-adb3347b538a req-484d619e-b8ef-4ebb-adf4-e31619a69340 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 7bee7c13-0682-47c7-9cc6-8206b550bfcc] Received unexpected event network-vif-plugged-c70fe92c-1439-4546-9f8b-4471945eef8f for instance with vm_state deleted and task_state None.
Jan 21 18:32:53 compute-0 nova_compute[183278]: 2026-01-21 18:32:53.904 183284 DEBUG nova.compute.manager [req-3eae283b-1f64-494f-9920-adb3347b538a req-484d619e-b8ef-4ebb-adf4-e31619a69340 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 7bee7c13-0682-47c7-9cc6-8206b550bfcc] Received event network-vif-deleted-c70fe92c-1439-4546-9f8b-4471945eef8f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:32:55 compute-0 nova_compute[183278]: 2026-01-21 18:32:55.916 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:32:56 compute-0 nova_compute[183278]: 2026-01-21 18:32:56.700 183284 DEBUG nova.compute.manager [req-cf217b2f-8e5b-4e71-ba43-5090f02cc89e req-e26d3c57-dce9-4d6c-8066-6c0fc3cecb51 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c0c2fb2d-4e32-4586-8725-c987369255b6] Received event network-vif-unplugged-23b1c1c6-59de-47af-98b4-6d28c3f94ff3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:32:56 compute-0 nova_compute[183278]: 2026-01-21 18:32:56.700 183284 DEBUG oslo_concurrency.lockutils [req-cf217b2f-8e5b-4e71-ba43-5090f02cc89e req-e26d3c57-dce9-4d6c-8066-6c0fc3cecb51 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "c0c2fb2d-4e32-4586-8725-c987369255b6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:32:56 compute-0 nova_compute[183278]: 2026-01-21 18:32:56.701 183284 DEBUG oslo_concurrency.lockutils [req-cf217b2f-8e5b-4e71-ba43-5090f02cc89e req-e26d3c57-dce9-4d6c-8066-6c0fc3cecb51 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "c0c2fb2d-4e32-4586-8725-c987369255b6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:32:56 compute-0 nova_compute[183278]: 2026-01-21 18:32:56.701 183284 DEBUG oslo_concurrency.lockutils [req-cf217b2f-8e5b-4e71-ba43-5090f02cc89e req-e26d3c57-dce9-4d6c-8066-6c0fc3cecb51 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "c0c2fb2d-4e32-4586-8725-c987369255b6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:32:56 compute-0 nova_compute[183278]: 2026-01-21 18:32:56.701 183284 DEBUG nova.compute.manager [req-cf217b2f-8e5b-4e71-ba43-5090f02cc89e req-e26d3c57-dce9-4d6c-8066-6c0fc3cecb51 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c0c2fb2d-4e32-4586-8725-c987369255b6] No waiting events found dispatching network-vif-unplugged-23b1c1c6-59de-47af-98b4-6d28c3f94ff3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:32:56 compute-0 nova_compute[183278]: 2026-01-21 18:32:56.701 183284 DEBUG nova.compute.manager [req-cf217b2f-8e5b-4e71-ba43-5090f02cc89e req-e26d3c57-dce9-4d6c-8066-6c0fc3cecb51 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c0c2fb2d-4e32-4586-8725-c987369255b6] Received event network-vif-unplugged-23b1c1c6-59de-47af-98b4-6d28c3f94ff3 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 21 18:32:58 compute-0 nova_compute[183278]: 2026-01-21 18:32:58.750 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:32:58 compute-0 nova_compute[183278]: 2026-01-21 18:32:58.798 183284 DEBUG nova.compute.manager [req-33f9f40e-3bcf-4956-8aec-d5da36954fbf req-e6752e14-dab9-421c-a487-30d3577b36e0 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c0c2fb2d-4e32-4586-8725-c987369255b6] Received event network-vif-plugged-23b1c1c6-59de-47af-98b4-6d28c3f94ff3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:32:58 compute-0 nova_compute[183278]: 2026-01-21 18:32:58.798 183284 DEBUG oslo_concurrency.lockutils [req-33f9f40e-3bcf-4956-8aec-d5da36954fbf req-e6752e14-dab9-421c-a487-30d3577b36e0 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "c0c2fb2d-4e32-4586-8725-c987369255b6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:32:58 compute-0 nova_compute[183278]: 2026-01-21 18:32:58.799 183284 DEBUG oslo_concurrency.lockutils [req-33f9f40e-3bcf-4956-8aec-d5da36954fbf req-e6752e14-dab9-421c-a487-30d3577b36e0 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "c0c2fb2d-4e32-4586-8725-c987369255b6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:32:58 compute-0 nova_compute[183278]: 2026-01-21 18:32:58.799 183284 DEBUG oslo_concurrency.lockutils [req-33f9f40e-3bcf-4956-8aec-d5da36954fbf req-e6752e14-dab9-421c-a487-30d3577b36e0 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "c0c2fb2d-4e32-4586-8725-c987369255b6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:32:58 compute-0 nova_compute[183278]: 2026-01-21 18:32:58.799 183284 DEBUG nova.compute.manager [req-33f9f40e-3bcf-4956-8aec-d5da36954fbf req-e6752e14-dab9-421c-a487-30d3577b36e0 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c0c2fb2d-4e32-4586-8725-c987369255b6] No waiting events found dispatching network-vif-plugged-23b1c1c6-59de-47af-98b4-6d28c3f94ff3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:32:58 compute-0 nova_compute[183278]: 2026-01-21 18:32:58.799 183284 WARNING nova.compute.manager [req-33f9f40e-3bcf-4956-8aec-d5da36954fbf req-e6752e14-dab9-421c-a487-30d3577b36e0 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c0c2fb2d-4e32-4586-8725-c987369255b6] Received unexpected event network-vif-plugged-23b1c1c6-59de-47af-98b4-6d28c3f94ff3 for instance with vm_state active and task_state deleting.
Jan 21 18:32:59 compute-0 nova_compute[183278]: 2026-01-21 18:32:59.642 183284 DEBUG nova.network.neutron [-] [instance: c0c2fb2d-4e32-4586-8725-c987369255b6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:32:59 compute-0 nova_compute[183278]: 2026-01-21 18:32:59.657 183284 INFO nova.compute.manager [-] [instance: c0c2fb2d-4e32-4586-8725-c987369255b6] Took 5.84 seconds to deallocate network for instance.
Jan 21 18:32:59 compute-0 nova_compute[183278]: 2026-01-21 18:32:59.697 183284 DEBUG oslo_concurrency.lockutils [None req-96380efd-a69b-4a12-9231-988a76eec70d 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:32:59 compute-0 nova_compute[183278]: 2026-01-21 18:32:59.697 183284 DEBUG oslo_concurrency.lockutils [None req-96380efd-a69b-4a12-9231-988a76eec70d 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:32:59 compute-0 podman[192560]: time="2026-01-21T18:32:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 18:32:59 compute-0 podman[192560]: @ - - [21/Jan/2026:18:32:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15354 "" "Go-http-client/1.1"
Jan 21 18:32:59 compute-0 podman[192560]: @ - - [21/Jan/2026:18:32:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2171 "" "Go-http-client/1.1"
Jan 21 18:32:59 compute-0 nova_compute[183278]: 2026-01-21 18:32:59.756 183284 DEBUG nova.compute.provider_tree [None req-96380efd-a69b-4a12-9231-988a76eec70d 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Inventory has not changed in ProviderTree for provider: 502e4243-611b-433d-a766-9b485d51652d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 18:32:59 compute-0 nova_compute[183278]: 2026-01-21 18:32:59.777 183284 DEBUG nova.scheduler.client.report [None req-96380efd-a69b-4a12-9231-988a76eec70d 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Inventory has not changed for provider 502e4243-611b-433d-a766-9b485d51652d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 18:32:59 compute-0 nova_compute[183278]: 2026-01-21 18:32:59.803 183284 DEBUG oslo_concurrency.lockutils [None req-96380efd-a69b-4a12-9231-988a76eec70d 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:32:59 compute-0 nova_compute[183278]: 2026-01-21 18:32:59.833 183284 INFO nova.scheduler.client.report [None req-96380efd-a69b-4a12-9231-988a76eec70d 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Deleted allocations for instance c0c2fb2d-4e32-4586-8725-c987369255b6
Jan 21 18:32:59 compute-0 nova_compute[183278]: 2026-01-21 18:32:59.912 183284 DEBUG oslo_concurrency.lockutils [None req-96380efd-a69b-4a12-9231-988a76eec70d 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "c0c2fb2d-4e32-4586-8725-c987369255b6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.440s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:33:00 compute-0 nova_compute[183278]: 2026-01-21 18:33:00.866 183284 DEBUG nova.compute.manager [req-bd4e36b0-03c4-4090-be78-760b85669032 req-26bf540c-c050-4a31-abe4-e13c40392f13 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c0c2fb2d-4e32-4586-8725-c987369255b6] Received event network-vif-deleted-23b1c1c6-59de-47af-98b4-6d28c3f94ff3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:33:00 compute-0 nova_compute[183278]: 2026-01-21 18:33:00.917 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:33:01 compute-0 openstack_network_exporter[195402]: ERROR   18:33:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 21 18:33:01 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:33:01 compute-0 openstack_network_exporter[195402]: ERROR   18:33:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 21 18:33:01 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:33:03 compute-0 nova_compute[183278]: 2026-01-21 18:33:03.752 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:33:05 compute-0 nova_compute[183278]: 2026-01-21 18:33:05.919 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:33:06 compute-0 nova_compute[183278]: 2026-01-21 18:33:06.758 183284 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769020371.756605, 7bee7c13-0682-47c7-9cc6-8206b550bfcc => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:33:06 compute-0 nova_compute[183278]: 2026-01-21 18:33:06.759 183284 INFO nova.compute.manager [-] [instance: 7bee7c13-0682-47c7-9cc6-8206b550bfcc] VM Stopped (Lifecycle Event)
Jan 21 18:33:07 compute-0 nova_compute[183278]: 2026-01-21 18:33:07.057 183284 DEBUG nova.compute.manager [None req-ee4fd7e7-594b-4a23-8016-84f194c21a79 - - - - - -] [instance: 7bee7c13-0682-47c7-9cc6-8206b550bfcc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:33:08 compute-0 nova_compute[183278]: 2026-01-21 18:33:08.730 183284 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769020373.729215, c0c2fb2d-4e32-4586-8725-c987369255b6 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:33:08 compute-0 nova_compute[183278]: 2026-01-21 18:33:08.730 183284 INFO nova.compute.manager [-] [instance: c0c2fb2d-4e32-4586-8725-c987369255b6] VM Stopped (Lifecycle Event)
Jan 21 18:33:08 compute-0 nova_compute[183278]: 2026-01-21 18:33:08.750 183284 DEBUG nova.compute.manager [None req-d40f0b1f-f594-43e5-be7b-5e20edaa1ce5 - - - - - -] [instance: c0c2fb2d-4e32-4586-8725-c987369255b6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:33:08 compute-0 nova_compute[183278]: 2026-01-21 18:33:08.754 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:33:09 compute-0 podman[210542]: 2026-01-21 18:33:09.006964971 +0000 UTC m=+0.064532801 container health_status 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, release=1755695350, container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.openshift.expose-services=, maintainer=Red Hat, Inc., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, version=9.6)
Jan 21 18:33:10 compute-0 nova_compute[183278]: 2026-01-21 18:33:10.920 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:33:13 compute-0 nova_compute[183278]: 2026-01-21 18:33:13.756 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:33:14 compute-0 podman[210564]: 2026-01-21 18:33:14.995282294 +0000 UTC m=+0.050554763 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent)
Jan 21 18:33:15 compute-0 podman[210563]: 2026-01-21 18:33:15.012440609 +0000 UTC m=+0.071859288 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, 
org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 21 18:33:15 compute-0 nova_compute[183278]: 2026-01-21 18:33:15.926 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:33:18 compute-0 nova_compute[183278]: 2026-01-21 18:33:18.758 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:33:19 compute-0 podman[210608]: 2026-01-21 18:33:19.989393801 +0000 UTC m=+0.048641108 container health_status 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 21 18:33:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:33:20.093 104698 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:33:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:33:20.094 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:33:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:33:20.094 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:33:20 compute-0 nova_compute[183278]: 2026-01-21 18:33:20.818 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:33:20 compute-0 nova_compute[183278]: 2026-01-21 18:33:20.818 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 18:33:20 compute-0 nova_compute[183278]: 2026-01-21 18:33:20.819 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 18:33:20 compute-0 nova_compute[183278]: 2026-01-21 18:33:20.930 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:33:23 compute-0 nova_compute[183278]: 2026-01-21 18:33:23.394 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 21 18:33:23 compute-0 nova_compute[183278]: 2026-01-21 18:33:23.394 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:33:23 compute-0 nova_compute[183278]: 2026-01-21 18:33:23.760 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:33:25 compute-0 nova_compute[183278]: 2026-01-21 18:33:25.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:33:25 compute-0 nova_compute[183278]: 2026-01-21 18:33:25.818 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:33:25 compute-0 nova_compute[183278]: 2026-01-21 18:33:25.930 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:33:25 compute-0 nova_compute[183278]: 2026-01-21 18:33:25.930 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:33:25 compute-0 nova_compute[183278]: 2026-01-21 18:33:25.930 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:33:25 compute-0 nova_compute[183278]: 2026-01-21 18:33:25.930 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 18:33:25 compute-0 nova_compute[183278]: 2026-01-21 18:33:25.931 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:33:26 compute-0 nova_compute[183278]: 2026-01-21 18:33:26.080 183284 WARNING nova.virt.libvirt.driver [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 18:33:26 compute-0 nova_compute[183278]: 2026-01-21 18:33:26.081 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5841MB free_disk=73.38093948364258GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 18:33:26 compute-0 nova_compute[183278]: 2026-01-21 18:33:26.081 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:33:26 compute-0 nova_compute[183278]: 2026-01-21 18:33:26.081 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:33:26 compute-0 nova_compute[183278]: 2026-01-21 18:33:26.134 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 18:33:26 compute-0 nova_compute[183278]: 2026-01-21 18:33:26.135 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 18:33:26 compute-0 nova_compute[183278]: 2026-01-21 18:33:26.151 183284 DEBUG nova.scheduler.client.report [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Refreshing inventories for resource provider 502e4243-611b-433d-a766-9b485d51652d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 21 18:33:26 compute-0 nova_compute[183278]: 2026-01-21 18:33:26.169 183284 DEBUG nova.scheduler.client.report [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Updating ProviderTree inventory for provider 502e4243-611b-433d-a766-9b485d51652d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 21 18:33:26 compute-0 nova_compute[183278]: 2026-01-21 18:33:26.169 183284 DEBUG nova.compute.provider_tree [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Updating inventory in ProviderTree for provider 502e4243-611b-433d-a766-9b485d51652d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 21 18:33:26 compute-0 nova_compute[183278]: 2026-01-21 18:33:26.183 183284 DEBUG nova.scheduler.client.report [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Refreshing aggregate associations for resource provider 502e4243-611b-433d-a766-9b485d51652d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 21 18:33:26 compute-0 nova_compute[183278]: 2026-01-21 18:33:26.205 183284 DEBUG nova.scheduler.client.report [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Refreshing trait associations for resource provider 502e4243-611b-433d-a766-9b485d51652d, traits: COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_RESCUE_BFV,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_MMX,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NODE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_IDE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 21 18:33:26 compute-0 nova_compute[183278]: 2026-01-21 18:33:26.234 183284 DEBUG nova.compute.provider_tree [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Inventory has not changed in ProviderTree for provider: 502e4243-611b-433d-a766-9b485d51652d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 18:33:26 compute-0 nova_compute[183278]: 2026-01-21 18:33:26.253 183284 DEBUG nova.scheduler.client.report [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Inventory has not changed for provider 502e4243-611b-433d-a766-9b485d51652d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 18:33:26 compute-0 nova_compute[183278]: 2026-01-21 18:33:26.274 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 18:33:26 compute-0 nova_compute[183278]: 2026-01-21 18:33:26.274 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.193s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:33:27 compute-0 nova_compute[183278]: 2026-01-21 18:33:27.272 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:33:27 compute-0 nova_compute[183278]: 2026-01-21 18:33:27.811 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:33:28 compute-0 ovn_controller[95419]: 2026-01-21T18:33:28Z|00155|memory_trim|INFO|Detected inactivity (last active 30013 ms ago): trimming memory
Jan 21 18:33:28 compute-0 nova_compute[183278]: 2026-01-21 18:33:28.761 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:33:28 compute-0 nova_compute[183278]: 2026-01-21 18:33:28.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:33:29 compute-0 podman[192560]: time="2026-01-21T18:33:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 18:33:29 compute-0 podman[192560]: @ - - [21/Jan/2026:18:33:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15354 "" "Go-http-client/1.1"
Jan 21 18:33:29 compute-0 podman[192560]: @ - - [21/Jan/2026:18:33:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2176 "" "Go-http-client/1.1"
Jan 21 18:33:29 compute-0 nova_compute[183278]: 2026-01-21 18:33:29.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:33:30 compute-0 nova_compute[183278]: 2026-01-21 18:33:30.933 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:33:31 compute-0 openstack_network_exporter[195402]: ERROR   18:33:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 21 18:33:31 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:33:31 compute-0 openstack_network_exporter[195402]: ERROR   18:33:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 21 18:33:31 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:33:32 compute-0 nova_compute[183278]: 2026-01-21 18:33:32.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:33:33 compute-0 nova_compute[183278]: 2026-01-21 18:33:33.762 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:33:34 compute-0 nova_compute[183278]: 2026-01-21 18:33:34.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:33:34 compute-0 nova_compute[183278]: 2026-01-21 18:33:34.818 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 18:33:35 compute-0 nova_compute[183278]: 2026-01-21 18:33:35.935 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:33:35 compute-0 sshd-session[210634]: Invalid user solana from 64.227.98.100 port 34016
Jan 21 18:33:36 compute-0 sshd-session[210634]: Connection closed by invalid user solana 64.227.98.100 port 34016 [preauth]
Jan 21 18:33:38 compute-0 nova_compute[183278]: 2026-01-21 18:33:38.765 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:33:40 compute-0 podman[210636]: 2026-01-21 18:33:40.000407849 +0000 UTC m=+0.055528322 container health_status 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, container_name=openstack_network_exporter, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git)
Jan 21 18:33:40 compute-0 nova_compute[183278]: 2026-01-21 18:33:40.908 183284 DEBUG oslo_concurrency.lockutils [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Acquiring lock "4e80cef6-fc6e-4f0f-b3a8-68f17d680983" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:33:40 compute-0 nova_compute[183278]: 2026-01-21 18:33:40.908 183284 DEBUG oslo_concurrency.lockutils [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "4e80cef6-fc6e-4f0f-b3a8-68f17d680983" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:33:40 compute-0 nova_compute[183278]: 2026-01-21 18:33:40.921 183284 DEBUG nova.compute.manager [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 21 18:33:40 compute-0 nova_compute[183278]: 2026-01-21 18:33:40.936 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:33:40 compute-0 nova_compute[183278]: 2026-01-21 18:33:40.985 183284 DEBUG oslo_concurrency.lockutils [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:33:40 compute-0 nova_compute[183278]: 2026-01-21 18:33:40.986 183284 DEBUG oslo_concurrency.lockutils [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:33:40 compute-0 nova_compute[183278]: 2026-01-21 18:33:40.992 183284 DEBUG nova.virt.hardware [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 18:33:40 compute-0 nova_compute[183278]: 2026-01-21 18:33:40.992 183284 INFO nova.compute.claims [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Claim successful on node compute-0.ctlplane.example.com
Jan 21 18:33:41 compute-0 nova_compute[183278]: 2026-01-21 18:33:41.099 183284 DEBUG nova.compute.provider_tree [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Inventory has not changed in ProviderTree for provider: 502e4243-611b-433d-a766-9b485d51652d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 18:33:41 compute-0 nova_compute[183278]: 2026-01-21 18:33:41.116 183284 DEBUG nova.scheduler.client.report [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Inventory has not changed for provider 502e4243-611b-433d-a766-9b485d51652d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 18:33:41 compute-0 nova_compute[183278]: 2026-01-21 18:33:41.135 183284 DEBUG oslo_concurrency.lockutils [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.149s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:33:41 compute-0 nova_compute[183278]: 2026-01-21 18:33:41.135 183284 DEBUG nova.compute.manager [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 21 18:33:41 compute-0 nova_compute[183278]: 2026-01-21 18:33:41.174 183284 DEBUG nova.compute.manager [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 21 18:33:41 compute-0 nova_compute[183278]: 2026-01-21 18:33:41.175 183284 DEBUG nova.network.neutron [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 21 18:33:41 compute-0 nova_compute[183278]: 2026-01-21 18:33:41.195 183284 INFO nova.virt.libvirt.driver [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 21 18:33:41 compute-0 nova_compute[183278]: 2026-01-21 18:33:41.218 183284 DEBUG nova.compute.manager [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 21 18:33:41 compute-0 nova_compute[183278]: 2026-01-21 18:33:41.326 183284 DEBUG nova.policy [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '41dc6e790bc54fbfaf5c6007d3fa5f63', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'fe688847145f4dee992c72dd40bbc1ac', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 21 18:33:41 compute-0 nova_compute[183278]: 2026-01-21 18:33:41.329 183284 DEBUG nova.compute.manager [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 21 18:33:41 compute-0 nova_compute[183278]: 2026-01-21 18:33:41.330 183284 DEBUG nova.virt.libvirt.driver [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 18:33:41 compute-0 nova_compute[183278]: 2026-01-21 18:33:41.331 183284 INFO nova.virt.libvirt.driver [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Creating image(s)
Jan 21 18:33:41 compute-0 nova_compute[183278]: 2026-01-21 18:33:41.331 183284 DEBUG oslo_concurrency.lockutils [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Acquiring lock "/var/lib/nova/instances/4e80cef6-fc6e-4f0f-b3a8-68f17d680983/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:33:41 compute-0 nova_compute[183278]: 2026-01-21 18:33:41.331 183284 DEBUG oslo_concurrency.lockutils [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "/var/lib/nova/instances/4e80cef6-fc6e-4f0f-b3a8-68f17d680983/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:33:41 compute-0 nova_compute[183278]: 2026-01-21 18:33:41.332 183284 DEBUG oslo_concurrency.lockutils [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "/var/lib/nova/instances/4e80cef6-fc6e-4f0f-b3a8-68f17d680983/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:33:41 compute-0 nova_compute[183278]: 2026-01-21 18:33:41.343 183284 DEBUG oslo_concurrency.processutils [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:33:41 compute-0 nova_compute[183278]: 2026-01-21 18:33:41.396 183284 DEBUG oslo_concurrency.processutils [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:33:41 compute-0 nova_compute[183278]: 2026-01-21 18:33:41.397 183284 DEBUG oslo_concurrency.lockutils [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Acquiring lock "8bf45782a806e8f2f684ae874be9ab99d891a685" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:33:41 compute-0 nova_compute[183278]: 2026-01-21 18:33:41.398 183284 DEBUG oslo_concurrency.lockutils [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "8bf45782a806e8f2f684ae874be9ab99d891a685" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:33:41 compute-0 nova_compute[183278]: 2026-01-21 18:33:41.409 183284 DEBUG oslo_concurrency.processutils [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:33:41 compute-0 nova_compute[183278]: 2026-01-21 18:33:41.471 183284 DEBUG oslo_concurrency.processutils [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:33:41 compute-0 nova_compute[183278]: 2026-01-21 18:33:41.472 183284 DEBUG oslo_concurrency.processutils [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685,backing_fmt=raw /var/lib/nova/instances/4e80cef6-fc6e-4f0f-b3a8-68f17d680983/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:33:41 compute-0 nova_compute[183278]: 2026-01-21 18:33:41.520 183284 DEBUG oslo_concurrency.processutils [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685,backing_fmt=raw /var/lib/nova/instances/4e80cef6-fc6e-4f0f-b3a8-68f17d680983/disk 1073741824" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:33:41 compute-0 nova_compute[183278]: 2026-01-21 18:33:41.522 183284 DEBUG oslo_concurrency.lockutils [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "8bf45782a806e8f2f684ae874be9ab99d891a685" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.124s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:33:41 compute-0 nova_compute[183278]: 2026-01-21 18:33:41.522 183284 DEBUG oslo_concurrency.processutils [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:33:41 compute-0 nova_compute[183278]: 2026-01-21 18:33:41.585 183284 DEBUG oslo_concurrency.processutils [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:33:41 compute-0 nova_compute[183278]: 2026-01-21 18:33:41.586 183284 DEBUG nova.virt.disk.api [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Checking if we can resize image /var/lib/nova/instances/4e80cef6-fc6e-4f0f-b3a8-68f17d680983/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 18:33:41 compute-0 nova_compute[183278]: 2026-01-21 18:33:41.586 183284 DEBUG oslo_concurrency.processutils [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4e80cef6-fc6e-4f0f-b3a8-68f17d680983/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:33:41 compute-0 nova_compute[183278]: 2026-01-21 18:33:41.648 183284 DEBUG oslo_concurrency.processutils [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4e80cef6-fc6e-4f0f-b3a8-68f17d680983/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:33:41 compute-0 nova_compute[183278]: 2026-01-21 18:33:41.649 183284 DEBUG nova.virt.disk.api [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Cannot resize image /var/lib/nova/instances/4e80cef6-fc6e-4f0f-b3a8-68f17d680983/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 18:33:41 compute-0 nova_compute[183278]: 2026-01-21 18:33:41.649 183284 DEBUG nova.objects.instance [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lazy-loading 'migration_context' on Instance uuid 4e80cef6-fc6e-4f0f-b3a8-68f17d680983 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:33:41 compute-0 nova_compute[183278]: 2026-01-21 18:33:41.662 183284 DEBUG nova.virt.libvirt.driver [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 18:33:41 compute-0 nova_compute[183278]: 2026-01-21 18:33:41.663 183284 DEBUG nova.virt.libvirt.driver [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Ensure instance console log exists: /var/lib/nova/instances/4e80cef6-fc6e-4f0f-b3a8-68f17d680983/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 18:33:41 compute-0 nova_compute[183278]: 2026-01-21 18:33:41.663 183284 DEBUG oslo_concurrency.lockutils [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:33:41 compute-0 nova_compute[183278]: 2026-01-21 18:33:41.663 183284 DEBUG oslo_concurrency.lockutils [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:33:41 compute-0 nova_compute[183278]: 2026-01-21 18:33:41.664 183284 DEBUG oslo_concurrency.lockutils [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:33:41 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:33:41.949 104698 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=21, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e2:aa:66', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f6:39:87:73:c8'}, ipsec=False) old=SB_Global(nb_cfg=20) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 18:33:41 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:33:41.950 104698 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 21 18:33:41 compute-0 nova_compute[183278]: 2026-01-21 18:33:41.951 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:33:42 compute-0 nova_compute[183278]: 2026-01-21 18:33:42.010 183284 DEBUG nova.network.neutron [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Successfully created port: 6964928b-8d3f-4817-a8c6-b2f4fc29ef45 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 21 18:33:42 compute-0 nova_compute[183278]: 2026-01-21 18:33:42.763 183284 DEBUG nova.network.neutron [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Successfully updated port: 6964928b-8d3f-4817-a8c6-b2f4fc29ef45 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 21 18:33:42 compute-0 nova_compute[183278]: 2026-01-21 18:33:42.775 183284 DEBUG oslo_concurrency.lockutils [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Acquiring lock "refresh_cache-4e80cef6-fc6e-4f0f-b3a8-68f17d680983" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:33:42 compute-0 nova_compute[183278]: 2026-01-21 18:33:42.776 183284 DEBUG oslo_concurrency.lockutils [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Acquired lock "refresh_cache-4e80cef6-fc6e-4f0f-b3a8-68f17d680983" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 18:33:42 compute-0 nova_compute[183278]: 2026-01-21 18:33:42.776 183284 DEBUG nova.network.neutron [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 18:33:42 compute-0 nova_compute[183278]: 2026-01-21 18:33:42.849 183284 DEBUG nova.compute.manager [req-b56363c1-ef3b-4bf5-aaf6-973b9b5f76a9 req-3c78b2ab-4096-4cd9-ab3f-220af742eb6e 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Received event network-changed-6964928b-8d3f-4817-a8c6-b2f4fc29ef45 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:33:42 compute-0 nova_compute[183278]: 2026-01-21 18:33:42.849 183284 DEBUG nova.compute.manager [req-b56363c1-ef3b-4bf5-aaf6-973b9b5f76a9 req-3c78b2ab-4096-4cd9-ab3f-220af742eb6e 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Refreshing instance network info cache due to event network-changed-6964928b-8d3f-4817-a8c6-b2f4fc29ef45. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 18:33:42 compute-0 nova_compute[183278]: 2026-01-21 18:33:42.850 183284 DEBUG oslo_concurrency.lockutils [req-b56363c1-ef3b-4bf5-aaf6-973b9b5f76a9 req-3c78b2ab-4096-4cd9-ab3f-220af742eb6e 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "refresh_cache-4e80cef6-fc6e-4f0f-b3a8-68f17d680983" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:33:42 compute-0 nova_compute[183278]: 2026-01-21 18:33:42.892 183284 DEBUG nova.network.neutron [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 18:33:43 compute-0 nova_compute[183278]: 2026-01-21 18:33:43.404 183284 DEBUG nova.network.neutron [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Updating instance_info_cache with network_info: [{"id": "6964928b-8d3f-4817-a8c6-b2f4fc29ef45", "address": "fa:16:3e:e5:ff:50", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6964928b-8d", "ovs_interfaceid": "6964928b-8d3f-4817-a8c6-b2f4fc29ef45", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:33:43 compute-0 nova_compute[183278]: 2026-01-21 18:33:43.425 183284 DEBUG oslo_concurrency.lockutils [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Releasing lock "refresh_cache-4e80cef6-fc6e-4f0f-b3a8-68f17d680983" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 18:33:43 compute-0 nova_compute[183278]: 2026-01-21 18:33:43.425 183284 DEBUG nova.compute.manager [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Instance network_info: |[{"id": "6964928b-8d3f-4817-a8c6-b2f4fc29ef45", "address": "fa:16:3e:e5:ff:50", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6964928b-8d", "ovs_interfaceid": "6964928b-8d3f-4817-a8c6-b2f4fc29ef45", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 21 18:33:43 compute-0 nova_compute[183278]: 2026-01-21 18:33:43.426 183284 DEBUG oslo_concurrency.lockutils [req-b56363c1-ef3b-4bf5-aaf6-973b9b5f76a9 req-3c78b2ab-4096-4cd9-ab3f-220af742eb6e 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquired lock "refresh_cache-4e80cef6-fc6e-4f0f-b3a8-68f17d680983" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 18:33:43 compute-0 nova_compute[183278]: 2026-01-21 18:33:43.426 183284 DEBUG nova.network.neutron [req-b56363c1-ef3b-4bf5-aaf6-973b9b5f76a9 req-3c78b2ab-4096-4cd9-ab3f-220af742eb6e 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Refreshing network info cache for port 6964928b-8d3f-4817-a8c6-b2f4fc29ef45 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 18:33:43 compute-0 nova_compute[183278]: 2026-01-21 18:33:43.429 183284 DEBUG nova.virt.libvirt.driver [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Start _get_guest_xml network_info=[{"id": "6964928b-8d3f-4817-a8c6-b2f4fc29ef45", "address": "fa:16:3e:e5:ff:50", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6964928b-8d", "ovs_interfaceid": "6964928b-8d3f-4817-a8c6-b2f4fc29ef45", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T18:09:31Z,direct_url=<?>,disk_format='qcow2',id=672306ae-5521-4fc1-a825-a16d6d125c61,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='05bd8d3790e24414a90ecf55ee1939e1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T18:09:33Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_type': 'disk', 'encryption_options': None, 'boot_index': 0, 'encryption_secret_uuid': None, 'encryption_format': None, 'image_id': '672306ae-5521-4fc1-a825-a16d6d125c61'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 18:33:43 compute-0 nova_compute[183278]: 2026-01-21 18:33:43.433 183284 WARNING nova.virt.libvirt.driver [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 18:33:43 compute-0 nova_compute[183278]: 2026-01-21 18:33:43.439 183284 DEBUG nova.virt.libvirt.host [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 18:33:43 compute-0 nova_compute[183278]: 2026-01-21 18:33:43.439 183284 DEBUG nova.virt.libvirt.host [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 18:33:43 compute-0 nova_compute[183278]: 2026-01-21 18:33:43.445 183284 DEBUG nova.virt.libvirt.host [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 18:33:43 compute-0 nova_compute[183278]: 2026-01-21 18:33:43.445 183284 DEBUG nova.virt.libvirt.host [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 18:33:43 compute-0 nova_compute[183278]: 2026-01-21 18:33:43.446 183284 DEBUG nova.virt.libvirt.driver [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 18:33:43 compute-0 nova_compute[183278]: 2026-01-21 18:33:43.447 183284 DEBUG nova.virt.hardware [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T18:09:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='45095fe9-3fd5-4f1f-87b2-a2a8292135a2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T18:09:31Z,direct_url=<?>,disk_format='qcow2',id=672306ae-5521-4fc1-a825-a16d6d125c61,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='05bd8d3790e24414a90ecf55ee1939e1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T18:09:33Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 18:33:43 compute-0 nova_compute[183278]: 2026-01-21 18:33:43.447 183284 DEBUG nova.virt.hardware [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 18:33:43 compute-0 nova_compute[183278]: 2026-01-21 18:33:43.447 183284 DEBUG nova.virt.hardware [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 18:33:43 compute-0 nova_compute[183278]: 2026-01-21 18:33:43.448 183284 DEBUG nova.virt.hardware [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 18:33:43 compute-0 nova_compute[183278]: 2026-01-21 18:33:43.448 183284 DEBUG nova.virt.hardware [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 18:33:43 compute-0 nova_compute[183278]: 2026-01-21 18:33:43.448 183284 DEBUG nova.virt.hardware [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 18:33:43 compute-0 nova_compute[183278]: 2026-01-21 18:33:43.448 183284 DEBUG nova.virt.hardware [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 18:33:43 compute-0 nova_compute[183278]: 2026-01-21 18:33:43.448 183284 DEBUG nova.virt.hardware [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 18:33:43 compute-0 nova_compute[183278]: 2026-01-21 18:33:43.449 183284 DEBUG nova.virt.hardware [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 18:33:43 compute-0 nova_compute[183278]: 2026-01-21 18:33:43.449 183284 DEBUG nova.virt.hardware [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 18:33:43 compute-0 nova_compute[183278]: 2026-01-21 18:33:43.449 183284 DEBUG nova.virt.hardware [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 18:33:43 compute-0 nova_compute[183278]: 2026-01-21 18:33:43.452 183284 DEBUG nova.virt.libvirt.vif [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T18:33:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-869424969',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-869424969',id=22,image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fe688847145f4dee992c72dd40bbc1ac',ramdisk_id='',reservation_id='r-wnk1toif',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1753607426',owner_user_name='tempest-TestExecuteStrategies-1753607426-project-member'},tags=TagList,task_s
tate='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T18:33:41Z,user_data=None,user_id='41dc6e790bc54fbfaf5c6007d3fa5f63',uuid=4e80cef6-fc6e-4f0f-b3a8-68f17d680983,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6964928b-8d3f-4817-a8c6-b2f4fc29ef45", "address": "fa:16:3e:e5:ff:50", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6964928b-8d", "ovs_interfaceid": "6964928b-8d3f-4817-a8c6-b2f4fc29ef45", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 18:33:43 compute-0 nova_compute[183278]: 2026-01-21 18:33:43.453 183284 DEBUG nova.network.os_vif_util [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Converting VIF {"id": "6964928b-8d3f-4817-a8c6-b2f4fc29ef45", "address": "fa:16:3e:e5:ff:50", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6964928b-8d", "ovs_interfaceid": "6964928b-8d3f-4817-a8c6-b2f4fc29ef45", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 18:33:43 compute-0 nova_compute[183278]: 2026-01-21 18:33:43.453 183284 DEBUG nova.network.os_vif_util [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e5:ff:50,bridge_name='br-int',has_traffic_filtering=True,id=6964928b-8d3f-4817-a8c6-b2f4fc29ef45,network=Network(405ec01b-76d3-4c3c-a31b-5f16d9641fbf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6964928b-8d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 18:33:43 compute-0 nova_compute[183278]: 2026-01-21 18:33:43.454 183284 DEBUG nova.objects.instance [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lazy-loading 'pci_devices' on Instance uuid 4e80cef6-fc6e-4f0f-b3a8-68f17d680983 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:33:43 compute-0 nova_compute[183278]: 2026-01-21 18:33:43.471 183284 DEBUG nova.virt.libvirt.driver [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] End _get_guest_xml xml=<domain type="kvm">
Jan 21 18:33:43 compute-0 nova_compute[183278]:   <uuid>4e80cef6-fc6e-4f0f-b3a8-68f17d680983</uuid>
Jan 21 18:33:43 compute-0 nova_compute[183278]:   <name>instance-00000016</name>
Jan 21 18:33:43 compute-0 nova_compute[183278]:   <memory>131072</memory>
Jan 21 18:33:43 compute-0 nova_compute[183278]:   <vcpu>1</vcpu>
Jan 21 18:33:43 compute-0 nova_compute[183278]:   <metadata>
Jan 21 18:33:43 compute-0 nova_compute[183278]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 18:33:43 compute-0 nova_compute[183278]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 18:33:43 compute-0 nova_compute[183278]:       <nova:name>tempest-TestExecuteStrategies-server-869424969</nova:name>
Jan 21 18:33:43 compute-0 nova_compute[183278]:       <nova:creationTime>2026-01-21 18:33:43</nova:creationTime>
Jan 21 18:33:43 compute-0 nova_compute[183278]:       <nova:flavor name="m1.nano">
Jan 21 18:33:43 compute-0 nova_compute[183278]:         <nova:memory>128</nova:memory>
Jan 21 18:33:43 compute-0 nova_compute[183278]:         <nova:disk>1</nova:disk>
Jan 21 18:33:43 compute-0 nova_compute[183278]:         <nova:swap>0</nova:swap>
Jan 21 18:33:43 compute-0 nova_compute[183278]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 18:33:43 compute-0 nova_compute[183278]:         <nova:vcpus>1</nova:vcpus>
Jan 21 18:33:43 compute-0 nova_compute[183278]:       </nova:flavor>
Jan 21 18:33:43 compute-0 nova_compute[183278]:       <nova:owner>
Jan 21 18:33:43 compute-0 nova_compute[183278]:         <nova:user uuid="41dc6e790bc54fbfaf5c6007d3fa5f63">tempest-TestExecuteStrategies-1753607426-project-member</nova:user>
Jan 21 18:33:43 compute-0 nova_compute[183278]:         <nova:project uuid="fe688847145f4dee992c72dd40bbc1ac">tempest-TestExecuteStrategies-1753607426</nova:project>
Jan 21 18:33:43 compute-0 nova_compute[183278]:       </nova:owner>
Jan 21 18:33:43 compute-0 nova_compute[183278]:       <nova:root type="image" uuid="672306ae-5521-4fc1-a825-a16d6d125c61"/>
Jan 21 18:33:43 compute-0 nova_compute[183278]:       <nova:ports>
Jan 21 18:33:43 compute-0 nova_compute[183278]:         <nova:port uuid="6964928b-8d3f-4817-a8c6-b2f4fc29ef45">
Jan 21 18:33:43 compute-0 nova_compute[183278]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 21 18:33:43 compute-0 nova_compute[183278]:         </nova:port>
Jan 21 18:33:43 compute-0 nova_compute[183278]:       </nova:ports>
Jan 21 18:33:43 compute-0 nova_compute[183278]:     </nova:instance>
Jan 21 18:33:43 compute-0 nova_compute[183278]:   </metadata>
Jan 21 18:33:43 compute-0 nova_compute[183278]:   <sysinfo type="smbios">
Jan 21 18:33:43 compute-0 nova_compute[183278]:     <system>
Jan 21 18:33:43 compute-0 nova_compute[183278]:       <entry name="manufacturer">RDO</entry>
Jan 21 18:33:43 compute-0 nova_compute[183278]:       <entry name="product">OpenStack Compute</entry>
Jan 21 18:33:43 compute-0 nova_compute[183278]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 18:33:43 compute-0 nova_compute[183278]:       <entry name="serial">4e80cef6-fc6e-4f0f-b3a8-68f17d680983</entry>
Jan 21 18:33:43 compute-0 nova_compute[183278]:       <entry name="uuid">4e80cef6-fc6e-4f0f-b3a8-68f17d680983</entry>
Jan 21 18:33:43 compute-0 nova_compute[183278]:       <entry name="family">Virtual Machine</entry>
Jan 21 18:33:43 compute-0 nova_compute[183278]:     </system>
Jan 21 18:33:43 compute-0 nova_compute[183278]:   </sysinfo>
Jan 21 18:33:43 compute-0 nova_compute[183278]:   <os>
Jan 21 18:33:43 compute-0 nova_compute[183278]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 18:33:43 compute-0 nova_compute[183278]:     <boot dev="hd"/>
Jan 21 18:33:43 compute-0 nova_compute[183278]:     <smbios mode="sysinfo"/>
Jan 21 18:33:43 compute-0 nova_compute[183278]:   </os>
Jan 21 18:33:43 compute-0 nova_compute[183278]:   <features>
Jan 21 18:33:43 compute-0 nova_compute[183278]:     <acpi/>
Jan 21 18:33:43 compute-0 nova_compute[183278]:     <apic/>
Jan 21 18:33:43 compute-0 nova_compute[183278]:     <vmcoreinfo/>
Jan 21 18:33:43 compute-0 nova_compute[183278]:   </features>
Jan 21 18:33:43 compute-0 nova_compute[183278]:   <clock offset="utc">
Jan 21 18:33:43 compute-0 nova_compute[183278]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 18:33:43 compute-0 nova_compute[183278]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 18:33:43 compute-0 nova_compute[183278]:     <timer name="hpet" present="no"/>
Jan 21 18:33:43 compute-0 nova_compute[183278]:   </clock>
Jan 21 18:33:43 compute-0 nova_compute[183278]:   <cpu mode="custom" match="exact">
Jan 21 18:33:43 compute-0 nova_compute[183278]:     <model>Nehalem</model>
Jan 21 18:33:43 compute-0 nova_compute[183278]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 18:33:43 compute-0 nova_compute[183278]:   </cpu>
Jan 21 18:33:43 compute-0 nova_compute[183278]:   <devices>
Jan 21 18:33:43 compute-0 nova_compute[183278]:     <disk type="file" device="disk">
Jan 21 18:33:43 compute-0 nova_compute[183278]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 18:33:43 compute-0 nova_compute[183278]:       <source file="/var/lib/nova/instances/4e80cef6-fc6e-4f0f-b3a8-68f17d680983/disk"/>
Jan 21 18:33:43 compute-0 nova_compute[183278]:       <target dev="vda" bus="virtio"/>
Jan 21 18:33:43 compute-0 nova_compute[183278]:     </disk>
Jan 21 18:33:43 compute-0 nova_compute[183278]:     <disk type="file" device="cdrom">
Jan 21 18:33:43 compute-0 nova_compute[183278]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 18:33:43 compute-0 nova_compute[183278]:       <source file="/var/lib/nova/instances/4e80cef6-fc6e-4f0f-b3a8-68f17d680983/disk.config"/>
Jan 21 18:33:43 compute-0 nova_compute[183278]:       <target dev="sda" bus="sata"/>
Jan 21 18:33:43 compute-0 nova_compute[183278]:     </disk>
Jan 21 18:33:43 compute-0 nova_compute[183278]:     <interface type="ethernet">
Jan 21 18:33:43 compute-0 nova_compute[183278]:       <mac address="fa:16:3e:e5:ff:50"/>
Jan 21 18:33:43 compute-0 nova_compute[183278]:       <model type="virtio"/>
Jan 21 18:33:43 compute-0 nova_compute[183278]:       <driver name="vhost" rx_queue_size="512"/>
Jan 21 18:33:43 compute-0 nova_compute[183278]:       <mtu size="1442"/>
Jan 21 18:33:43 compute-0 nova_compute[183278]:       <target dev="tap6964928b-8d"/>
Jan 21 18:33:43 compute-0 nova_compute[183278]:     </interface>
Jan 21 18:33:43 compute-0 nova_compute[183278]:     <serial type="pty">
Jan 21 18:33:43 compute-0 nova_compute[183278]:       <log file="/var/lib/nova/instances/4e80cef6-fc6e-4f0f-b3a8-68f17d680983/console.log" append="off"/>
Jan 21 18:33:43 compute-0 nova_compute[183278]:     </serial>
Jan 21 18:33:43 compute-0 nova_compute[183278]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 18:33:43 compute-0 nova_compute[183278]:     <video>
Jan 21 18:33:43 compute-0 nova_compute[183278]:       <model type="virtio"/>
Jan 21 18:33:43 compute-0 nova_compute[183278]:     </video>
Jan 21 18:33:43 compute-0 nova_compute[183278]:     <input type="tablet" bus="usb"/>
Jan 21 18:33:43 compute-0 nova_compute[183278]:     <rng model="virtio">
Jan 21 18:33:43 compute-0 nova_compute[183278]:       <backend model="random">/dev/urandom</backend>
Jan 21 18:33:43 compute-0 nova_compute[183278]:     </rng>
Jan 21 18:33:43 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root"/>
Jan 21 18:33:43 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:33:43 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:33:43 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:33:43 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:33:43 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:33:43 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:33:43 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:33:43 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:33:43 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:33:43 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:33:43 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:33:43 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:33:43 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:33:43 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:33:43 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:33:43 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:33:43 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:33:43 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:33:43 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:33:43 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:33:43 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:33:43 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:33:43 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:33:43 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:33:43 compute-0 nova_compute[183278]:     <controller type="usb" index="0"/>
Jan 21 18:33:43 compute-0 nova_compute[183278]:     <memballoon model="virtio">
Jan 21 18:33:43 compute-0 nova_compute[183278]:       <stats period="10"/>
Jan 21 18:33:43 compute-0 nova_compute[183278]:     </memballoon>
Jan 21 18:33:43 compute-0 nova_compute[183278]:   </devices>
Jan 21 18:33:43 compute-0 nova_compute[183278]: </domain>
Jan 21 18:33:43 compute-0 nova_compute[183278]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 18:33:43 compute-0 nova_compute[183278]: 2026-01-21 18:33:43.472 183284 DEBUG nova.compute.manager [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Preparing to wait for external event network-vif-plugged-6964928b-8d3f-4817-a8c6-b2f4fc29ef45 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 21 18:33:43 compute-0 nova_compute[183278]: 2026-01-21 18:33:43.472 183284 DEBUG oslo_concurrency.lockutils [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Acquiring lock "4e80cef6-fc6e-4f0f-b3a8-68f17d680983-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:33:43 compute-0 nova_compute[183278]: 2026-01-21 18:33:43.473 183284 DEBUG oslo_concurrency.lockutils [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "4e80cef6-fc6e-4f0f-b3a8-68f17d680983-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:33:43 compute-0 nova_compute[183278]: 2026-01-21 18:33:43.473 183284 DEBUG oslo_concurrency.lockutils [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "4e80cef6-fc6e-4f0f-b3a8-68f17d680983-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:33:43 compute-0 nova_compute[183278]: 2026-01-21 18:33:43.473 183284 DEBUG nova.virt.libvirt.vif [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T18:33:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-869424969',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-869424969',id=22,image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fe688847145f4dee992c72dd40bbc1ac',ramdisk_id='',reservation_id='r-wnk1toif',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1753607426',owner_user_name='tempest-TestExecuteStrategies-1753607426-project-member'},tags=TagL
ist,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T18:33:41Z,user_data=None,user_id='41dc6e790bc54fbfaf5c6007d3fa5f63',uuid=4e80cef6-fc6e-4f0f-b3a8-68f17d680983,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6964928b-8d3f-4817-a8c6-b2f4fc29ef45", "address": "fa:16:3e:e5:ff:50", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6964928b-8d", "ovs_interfaceid": "6964928b-8d3f-4817-a8c6-b2f4fc29ef45", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 21 18:33:43 compute-0 nova_compute[183278]: 2026-01-21 18:33:43.474 183284 DEBUG nova.network.os_vif_util [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Converting VIF {"id": "6964928b-8d3f-4817-a8c6-b2f4fc29ef45", "address": "fa:16:3e:e5:ff:50", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6964928b-8d", "ovs_interfaceid": "6964928b-8d3f-4817-a8c6-b2f4fc29ef45", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 18:33:43 compute-0 nova_compute[183278]: 2026-01-21 18:33:43.474 183284 DEBUG nova.network.os_vif_util [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e5:ff:50,bridge_name='br-int',has_traffic_filtering=True,id=6964928b-8d3f-4817-a8c6-b2f4fc29ef45,network=Network(405ec01b-76d3-4c3c-a31b-5f16d9641fbf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6964928b-8d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 18:33:43 compute-0 nova_compute[183278]: 2026-01-21 18:33:43.475 183284 DEBUG os_vif [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e5:ff:50,bridge_name='br-int',has_traffic_filtering=True,id=6964928b-8d3f-4817-a8c6-b2f4fc29ef45,network=Network(405ec01b-76d3-4c3c-a31b-5f16d9641fbf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6964928b-8d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 21 18:33:43 compute-0 nova_compute[183278]: 2026-01-21 18:33:43.475 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:33:43 compute-0 nova_compute[183278]: 2026-01-21 18:33:43.475 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:33:43 compute-0 nova_compute[183278]: 2026-01-21 18:33:43.476 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 18:33:43 compute-0 nova_compute[183278]: 2026-01-21 18:33:43.478 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:33:43 compute-0 nova_compute[183278]: 2026-01-21 18:33:43.478 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6964928b-8d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:33:43 compute-0 nova_compute[183278]: 2026-01-21 18:33:43.479 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6964928b-8d, col_values=(('external_ids', {'iface-id': '6964928b-8d3f-4817-a8c6-b2f4fc29ef45', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e5:ff:50', 'vm-uuid': '4e80cef6-fc6e-4f0f-b3a8-68f17d680983'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:33:43 compute-0 nova_compute[183278]: 2026-01-21 18:33:43.480 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:33:43 compute-0 NetworkManager[55506]: <info>  [1769020423.4813] manager: (tap6964928b-8d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/69)
Jan 21 18:33:43 compute-0 nova_compute[183278]: 2026-01-21 18:33:43.483 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 18:33:43 compute-0 nova_compute[183278]: 2026-01-21 18:33:43.486 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:33:43 compute-0 nova_compute[183278]: 2026-01-21 18:33:43.486 183284 INFO os_vif [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e5:ff:50,bridge_name='br-int',has_traffic_filtering=True,id=6964928b-8d3f-4817-a8c6-b2f4fc29ef45,network=Network(405ec01b-76d3-4c3c-a31b-5f16d9641fbf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6964928b-8d')
Jan 21 18:33:43 compute-0 nova_compute[183278]: 2026-01-21 18:33:43.527 183284 DEBUG nova.virt.libvirt.driver [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 18:33:43 compute-0 nova_compute[183278]: 2026-01-21 18:33:43.527 183284 DEBUG nova.virt.libvirt.driver [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 18:33:43 compute-0 nova_compute[183278]: 2026-01-21 18:33:43.527 183284 DEBUG nova.virt.libvirt.driver [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] No VIF found with MAC fa:16:3e:e5:ff:50, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 21 18:33:43 compute-0 nova_compute[183278]: 2026-01-21 18:33:43.528 183284 INFO nova.virt.libvirt.driver [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Using config drive
Jan 21 18:33:43 compute-0 nova_compute[183278]: 2026-01-21 18:33:43.866 183284 INFO nova.virt.libvirt.driver [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Creating config drive at /var/lib/nova/instances/4e80cef6-fc6e-4f0f-b3a8-68f17d680983/disk.config
Jan 21 18:33:43 compute-0 nova_compute[183278]: 2026-01-21 18:33:43.872 183284 DEBUG oslo_concurrency.processutils [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4e80cef6-fc6e-4f0f-b3a8-68f17d680983/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpnr3ik0x2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:33:43 compute-0 nova_compute[183278]: 2026-01-21 18:33:43.994 183284 DEBUG oslo_concurrency.processutils [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4e80cef6-fc6e-4f0f-b3a8-68f17d680983/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpnr3ik0x2" returned: 0 in 0.122s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:33:44 compute-0 kernel: tap6964928b-8d: entered promiscuous mode
Jan 21 18:33:44 compute-0 NetworkManager[55506]: <info>  [1769020424.0516] manager: (tap6964928b-8d): new Tun device (/org/freedesktop/NetworkManager/Devices/70)
Jan 21 18:33:44 compute-0 ovn_controller[95419]: 2026-01-21T18:33:44Z|00156|binding|INFO|Claiming lport 6964928b-8d3f-4817-a8c6-b2f4fc29ef45 for this chassis.
Jan 21 18:33:44 compute-0 nova_compute[183278]: 2026-01-21 18:33:44.051 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:33:44 compute-0 ovn_controller[95419]: 2026-01-21T18:33:44Z|00157|binding|INFO|6964928b-8d3f-4817-a8c6-b2f4fc29ef45: Claiming fa:16:3e:e5:ff:50 10.100.0.3
Jan 21 18:33:44 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:33:44.059 104698 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e5:ff:50 10.100.0.3'], port_security=['fa:16:3e:e5:ff:50 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '4e80cef6-fc6e-4f0f-b3a8-68f17d680983', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-405ec01b-76d3-4c3c-a31b-5f16d9641fbf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe688847145f4dee992c72dd40bbc1ac', 'neutron:revision_number': '2', 'neutron:security_group_ids': '772bc98e-4f63-476c-ac15-1e185ee339f7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=64b67886-c0d9-40d2-a2d0-cf96a9cd3c14, chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>], logical_port=6964928b-8d3f-4817-a8c6-b2f4fc29ef45) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 18:33:44 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:33:44.060 104698 INFO neutron.agent.ovn.metadata.agent [-] Port 6964928b-8d3f-4817-a8c6-b2f4fc29ef45 in datapath 405ec01b-76d3-4c3c-a31b-5f16d9641fbf bound to our chassis
Jan 21 18:33:44 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:33:44.061 104698 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 405ec01b-76d3-4c3c-a31b-5f16d9641fbf
Jan 21 18:33:44 compute-0 ovn_controller[95419]: 2026-01-21T18:33:44Z|00158|binding|INFO|Setting lport 6964928b-8d3f-4817-a8c6-b2f4fc29ef45 ovn-installed in OVS
Jan 21 18:33:44 compute-0 ovn_controller[95419]: 2026-01-21T18:33:44Z|00159|binding|INFO|Setting lport 6964928b-8d3f-4817-a8c6-b2f4fc29ef45 up in Southbound
Jan 21 18:33:44 compute-0 nova_compute[183278]: 2026-01-21 18:33:44.068 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:33:44 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:33:44.074 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[7424f4ee-2c67-44ec-b7f9-29258c1fe45d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:33:44 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:33:44.074 104698 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap405ec01b-71 in ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 21 18:33:44 compute-0 systemd-udevd[210690]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 18:33:44 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:33:44.076 203892 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap405ec01b-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 21 18:33:44 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:33:44.077 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[41065827-b70d-4f95-a1bb-0762c6067e85]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:33:44 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:33:44.078 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[f1b1736e-8968-4fdf-88db-3f9538cf9577]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:33:44 compute-0 systemd-machined[154592]: New machine qemu-15-instance-00000016.
Jan 21 18:33:44 compute-0 NetworkManager[55506]: <info>  [1769020424.0890] device (tap6964928b-8d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 18:33:44 compute-0 NetworkManager[55506]: <info>  [1769020424.0902] device (tap6964928b-8d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 18:33:44 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:33:44.090 105036 DEBUG oslo.privsep.daemon [-] privsep: reply[dfb57a21-f8d8-48dc-8857-a658b6567b5d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:33:44 compute-0 systemd[1]: Started Virtual Machine qemu-15-instance-00000016.
Jan 21 18:33:44 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:33:44.105 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[454cc8c3-d1e6-466d-a403-d8a2c31cbe7d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:33:44 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:33:44.131 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[a78fe3f4-e855-49b3-92aa-143a01fa4a72]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:33:44 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:33:44.139 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[2c3dd3a6-2720-4446-8bfb-e6a73d7e69de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:33:44 compute-0 systemd-udevd[210695]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 18:33:44 compute-0 NetworkManager[55506]: <info>  [1769020424.1403] manager: (tap405ec01b-70): new Veth device (/org/freedesktop/NetworkManager/Devices/71)
Jan 21 18:33:44 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:33:44.170 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[55c6b10e-1318-43ae-9d2a-4eced0e5870c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:33:44 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:33:44.172 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[2d29f620-834d-41f2-96cb-96c069f391a4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:33:44 compute-0 NetworkManager[55506]: <info>  [1769020424.1946] device (tap405ec01b-70): carrier: link connected
Jan 21 18:33:44 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:33:44.202 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[627597bb-fd63-416c-9955-d268d08bcad0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:33:44 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:33:44.219 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[1416801e-d4bf-424f-bdb8-c8d5b76f98fe]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap405ec01b-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6c:95:02'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 49], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 498780, 'reachable_time': 43586, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 210724, 'error': None, 'target': 'ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:33:44 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:33:44.234 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[b9bdf077-ed50-41fb-a576-e307039c6b74]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe6c:9502'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 498780, 'tstamp': 498780}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 210726, 'error': None, 'target': 'ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:33:44 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:33:44.252 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[1bf4b223-1ebf-4b2d-bc9b-a8cb257a70a9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap405ec01b-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6c:95:02'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 49], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 498780, 'reachable_time': 43586, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 210728, 'error': None, 'target': 'ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:33:44 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:33:44.291 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[31fdf336-2c53-465c-a8b2-32b876543011]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:33:44 compute-0 nova_compute[183278]: 2026-01-21 18:33:44.340 183284 DEBUG nova.virt.driver [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Emitting event <LifecycleEvent: 1769020424.3393588, 4e80cef6-fc6e-4f0f-b3a8-68f17d680983 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:33:44 compute-0 nova_compute[183278]: 2026-01-21 18:33:44.340 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] VM Started (Lifecycle Event)
Jan 21 18:33:44 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:33:44.350 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[61def8e0-138a-4006-bfec-f81109d1a519]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:33:44 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:33:44.351 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap405ec01b-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:33:44 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:33:44.351 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 18:33:44 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:33:44.352 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap405ec01b-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:33:44 compute-0 nova_compute[183278]: 2026-01-21 18:33:44.353 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:33:44 compute-0 NetworkManager[55506]: <info>  [1769020424.3545] manager: (tap405ec01b-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/72)
Jan 21 18:33:44 compute-0 kernel: tap405ec01b-70: entered promiscuous mode
Jan 21 18:33:44 compute-0 nova_compute[183278]: 2026-01-21 18:33:44.356 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:33:44 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:33:44.357 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap405ec01b-70, col_values=(('external_ids', {'iface-id': '9c897ad2-8ce5-4903-8c83-1ed8f117dcdd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:33:44 compute-0 nova_compute[183278]: 2026-01-21 18:33:44.358 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:33:44 compute-0 ovn_controller[95419]: 2026-01-21T18:33:44Z|00160|binding|INFO|Releasing lport 9c897ad2-8ce5-4903-8c83-1ed8f117dcdd from this chassis (sb_readonly=0)
Jan 21 18:33:44 compute-0 nova_compute[183278]: 2026-01-21 18:33:44.370 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:33:44 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:33:44.370 104698 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/405ec01b-76d3-4c3c-a31b-5f16d9641fbf.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/405ec01b-76d3-4c3c-a31b-5f16d9641fbf.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 21 18:33:44 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:33:44.371 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[b063f105-f32e-4fdf-8167-6656d6c4b391]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:33:44 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:33:44.372 104698 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 18:33:44 compute-0 ovn_metadata_agent[104693]: global
Jan 21 18:33:44 compute-0 ovn_metadata_agent[104693]:     log         /dev/log local0 debug
Jan 21 18:33:44 compute-0 ovn_metadata_agent[104693]:     log-tag     haproxy-metadata-proxy-405ec01b-76d3-4c3c-a31b-5f16d9641fbf
Jan 21 18:33:44 compute-0 ovn_metadata_agent[104693]:     user        root
Jan 21 18:33:44 compute-0 ovn_metadata_agent[104693]:     group       root
Jan 21 18:33:44 compute-0 ovn_metadata_agent[104693]:     maxconn     1024
Jan 21 18:33:44 compute-0 ovn_metadata_agent[104693]:     pidfile     /var/lib/neutron/external/pids/405ec01b-76d3-4c3c-a31b-5f16d9641fbf.pid.haproxy
Jan 21 18:33:44 compute-0 ovn_metadata_agent[104693]:     daemon
Jan 21 18:33:44 compute-0 ovn_metadata_agent[104693]: 
Jan 21 18:33:44 compute-0 ovn_metadata_agent[104693]: defaults
Jan 21 18:33:44 compute-0 ovn_metadata_agent[104693]:     log global
Jan 21 18:33:44 compute-0 ovn_metadata_agent[104693]:     mode http
Jan 21 18:33:44 compute-0 ovn_metadata_agent[104693]:     option httplog
Jan 21 18:33:44 compute-0 ovn_metadata_agent[104693]:     option dontlognull
Jan 21 18:33:44 compute-0 ovn_metadata_agent[104693]:     option http-server-close
Jan 21 18:33:44 compute-0 ovn_metadata_agent[104693]:     option forwardfor
Jan 21 18:33:44 compute-0 ovn_metadata_agent[104693]:     retries                 3
Jan 21 18:33:44 compute-0 ovn_metadata_agent[104693]:     timeout http-request    30s
Jan 21 18:33:44 compute-0 ovn_metadata_agent[104693]:     timeout connect         30s
Jan 21 18:33:44 compute-0 ovn_metadata_agent[104693]:     timeout client          32s
Jan 21 18:33:44 compute-0 ovn_metadata_agent[104693]:     timeout server          32s
Jan 21 18:33:44 compute-0 ovn_metadata_agent[104693]:     timeout http-keep-alive 30s
Jan 21 18:33:44 compute-0 ovn_metadata_agent[104693]: 
Jan 21 18:33:44 compute-0 ovn_metadata_agent[104693]: 
Jan 21 18:33:44 compute-0 ovn_metadata_agent[104693]: listen listener
Jan 21 18:33:44 compute-0 ovn_metadata_agent[104693]:     bind 169.254.169.254:80
Jan 21 18:33:44 compute-0 ovn_metadata_agent[104693]:     server metadata /var/lib/neutron/metadata_proxy
Jan 21 18:33:44 compute-0 ovn_metadata_agent[104693]:     http-request add-header X-OVN-Network-ID 405ec01b-76d3-4c3c-a31b-5f16d9641fbf
Jan 21 18:33:44 compute-0 ovn_metadata_agent[104693]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 21 18:33:44 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:33:44.373 104698 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf', 'env', 'PROCESS_TAG=haproxy-405ec01b-76d3-4c3c-a31b-5f16d9641fbf', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/405ec01b-76d3-4c3c-a31b-5f16d9641fbf.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 21 18:33:44 compute-0 nova_compute[183278]: 2026-01-21 18:33:44.515 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:33:44 compute-0 nova_compute[183278]: 2026-01-21 18:33:44.519 183284 DEBUG nova.virt.driver [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Emitting event <LifecycleEvent: 1769020424.3398972, 4e80cef6-fc6e-4f0f-b3a8-68f17d680983 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:33:44 compute-0 nova_compute[183278]: 2026-01-21 18:33:44.519 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] VM Paused (Lifecycle Event)
Jan 21 18:33:44 compute-0 nova_compute[183278]: 2026-01-21 18:33:44.714 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:33:44 compute-0 nova_compute[183278]: 2026-01-21 18:33:44.718 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 18:33:44 compute-0 nova_compute[183278]: 2026-01-21 18:33:44.752 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 18:33:44 compute-0 podman[210765]: 2026-01-21 18:33:44.760666631 +0000 UTC m=+0.087979286 container create 1d51063bb9fd01bfb38a110f30b0d2a1c26ffef37a3ac177f042b802e919afdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Jan 21 18:33:44 compute-0 podman[210765]: 2026-01-21 18:33:44.696391459 +0000 UTC m=+0.023704134 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 18:33:44 compute-0 systemd[1]: Started libpod-conmon-1d51063bb9fd01bfb38a110f30b0d2a1c26ffef37a3ac177f042b802e919afdf.scope.
Jan 21 18:33:44 compute-0 nova_compute[183278]: 2026-01-21 18:33:44.884 183284 DEBUG nova.network.neutron [req-b56363c1-ef3b-4bf5-aaf6-973b9b5f76a9 req-3c78b2ab-4096-4cd9-ab3f-220af742eb6e 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Updated VIF entry in instance network info cache for port 6964928b-8d3f-4817-a8c6-b2f4fc29ef45. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 18:33:44 compute-0 nova_compute[183278]: 2026-01-21 18:33:44.886 183284 DEBUG nova.network.neutron [req-b56363c1-ef3b-4bf5-aaf6-973b9b5f76a9 req-3c78b2ab-4096-4cd9-ab3f-220af742eb6e 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Updating instance_info_cache with network_info: [{"id": "6964928b-8d3f-4817-a8c6-b2f4fc29ef45", "address": "fa:16:3e:e5:ff:50", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6964928b-8d", "ovs_interfaceid": "6964928b-8d3f-4817-a8c6-b2f4fc29ef45", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:33:44 compute-0 systemd[1]: Started libcrun container.
Jan 21 18:33:44 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e45a797b97a3ddaedcb15f037190b7a5e948b0e53176d31957f78b43f3d147a6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 18:33:45 compute-0 nova_compute[183278]: 2026-01-21 18:33:45.024 183284 DEBUG nova.compute.manager [req-063978a5-776d-4d1b-8197-ae797fe17f25 req-eaeef008-6091-4f90-89fa-067093837ffa 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Received event network-vif-plugged-6964928b-8d3f-4817-a8c6-b2f4fc29ef45 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:33:45 compute-0 nova_compute[183278]: 2026-01-21 18:33:45.025 183284 DEBUG oslo_concurrency.lockutils [req-063978a5-776d-4d1b-8197-ae797fe17f25 req-eaeef008-6091-4f90-89fa-067093837ffa 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "4e80cef6-fc6e-4f0f-b3a8-68f17d680983-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:33:45 compute-0 nova_compute[183278]: 2026-01-21 18:33:45.025 183284 DEBUG oslo_concurrency.lockutils [req-063978a5-776d-4d1b-8197-ae797fe17f25 req-eaeef008-6091-4f90-89fa-067093837ffa 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "4e80cef6-fc6e-4f0f-b3a8-68f17d680983-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:33:45 compute-0 nova_compute[183278]: 2026-01-21 18:33:45.025 183284 DEBUG oslo_concurrency.lockutils [req-063978a5-776d-4d1b-8197-ae797fe17f25 req-eaeef008-6091-4f90-89fa-067093837ffa 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "4e80cef6-fc6e-4f0f-b3a8-68f17d680983-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:33:45 compute-0 nova_compute[183278]: 2026-01-21 18:33:45.026 183284 DEBUG nova.compute.manager [req-063978a5-776d-4d1b-8197-ae797fe17f25 req-eaeef008-6091-4f90-89fa-067093837ffa 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Processing event network-vif-plugged-6964928b-8d3f-4817-a8c6-b2f4fc29ef45 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 21 18:33:45 compute-0 nova_compute[183278]: 2026-01-21 18:33:45.026 183284 DEBUG nova.compute.manager [req-063978a5-776d-4d1b-8197-ae797fe17f25 req-eaeef008-6091-4f90-89fa-067093837ffa 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Received event network-vif-plugged-6964928b-8d3f-4817-a8c6-b2f4fc29ef45 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:33:45 compute-0 nova_compute[183278]: 2026-01-21 18:33:45.026 183284 DEBUG oslo_concurrency.lockutils [req-063978a5-776d-4d1b-8197-ae797fe17f25 req-eaeef008-6091-4f90-89fa-067093837ffa 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "4e80cef6-fc6e-4f0f-b3a8-68f17d680983-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:33:45 compute-0 nova_compute[183278]: 2026-01-21 18:33:45.026 183284 DEBUG oslo_concurrency.lockutils [req-063978a5-776d-4d1b-8197-ae797fe17f25 req-eaeef008-6091-4f90-89fa-067093837ffa 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "4e80cef6-fc6e-4f0f-b3a8-68f17d680983-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:33:45 compute-0 nova_compute[183278]: 2026-01-21 18:33:45.027 183284 DEBUG oslo_concurrency.lockutils [req-063978a5-776d-4d1b-8197-ae797fe17f25 req-eaeef008-6091-4f90-89fa-067093837ffa 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "4e80cef6-fc6e-4f0f-b3a8-68f17d680983-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:33:45 compute-0 nova_compute[183278]: 2026-01-21 18:33:45.027 183284 DEBUG nova.compute.manager [req-063978a5-776d-4d1b-8197-ae797fe17f25 req-eaeef008-6091-4f90-89fa-067093837ffa 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] No waiting events found dispatching network-vif-plugged-6964928b-8d3f-4817-a8c6-b2f4fc29ef45 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:33:45 compute-0 nova_compute[183278]: 2026-01-21 18:33:45.027 183284 WARNING nova.compute.manager [req-063978a5-776d-4d1b-8197-ae797fe17f25 req-eaeef008-6091-4f90-89fa-067093837ffa 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Received unexpected event network-vif-plugged-6964928b-8d3f-4817-a8c6-b2f4fc29ef45 for instance with vm_state building and task_state spawning.
Jan 21 18:33:45 compute-0 nova_compute[183278]: 2026-01-21 18:33:45.028 183284 DEBUG nova.compute.manager [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 18:33:45 compute-0 nova_compute[183278]: 2026-01-21 18:33:45.029 183284 DEBUG oslo_concurrency.lockutils [req-b56363c1-ef3b-4bf5-aaf6-973b9b5f76a9 req-3c78b2ab-4096-4cd9-ab3f-220af742eb6e 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Releasing lock "refresh_cache-4e80cef6-fc6e-4f0f-b3a8-68f17d680983" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 18:33:45 compute-0 nova_compute[183278]: 2026-01-21 18:33:45.032 183284 DEBUG nova.virt.driver [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Emitting event <LifecycleEvent: 1769020425.0317636, 4e80cef6-fc6e-4f0f-b3a8-68f17d680983 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:33:45 compute-0 nova_compute[183278]: 2026-01-21 18:33:45.032 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] VM Resumed (Lifecycle Event)
Jan 21 18:33:45 compute-0 nova_compute[183278]: 2026-01-21 18:33:45.035 183284 DEBUG nova.virt.libvirt.driver [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 18:33:45 compute-0 nova_compute[183278]: 2026-01-21 18:33:45.038 183284 INFO nova.virt.libvirt.driver [-] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Instance spawned successfully.
Jan 21 18:33:45 compute-0 nova_compute[183278]: 2026-01-21 18:33:45.039 183284 DEBUG nova.virt.libvirt.driver [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 21 18:33:45 compute-0 podman[210765]: 2026-01-21 18:33:45.059020628 +0000 UTC m=+0.386333283 container init 1d51063bb9fd01bfb38a110f30b0d2a1c26ffef37a3ac177f042b802e919afdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 21 18:33:45 compute-0 podman[210765]: 2026-01-21 18:33:45.065464724 +0000 UTC m=+0.392777369 container start 1d51063bb9fd01bfb38a110f30b0d2a1c26ffef37a3ac177f042b802e919afdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 18:33:45 compute-0 nova_compute[183278]: 2026-01-21 18:33:45.077 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:33:45 compute-0 nova_compute[183278]: 2026-01-21 18:33:45.081 183284 DEBUG nova.virt.libvirt.driver [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:33:45 compute-0 nova_compute[183278]: 2026-01-21 18:33:45.081 183284 DEBUG nova.virt.libvirt.driver [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:33:45 compute-0 nova_compute[183278]: 2026-01-21 18:33:45.082 183284 DEBUG nova.virt.libvirt.driver [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:33:45 compute-0 nova_compute[183278]: 2026-01-21 18:33:45.082 183284 DEBUG nova.virt.libvirt.driver [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:33:45 compute-0 nova_compute[183278]: 2026-01-21 18:33:45.083 183284 DEBUG nova.virt.libvirt.driver [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:33:45 compute-0 nova_compute[183278]: 2026-01-21 18:33:45.083 183284 DEBUG nova.virt.libvirt.driver [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:33:45 compute-0 neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf[210780]: [NOTICE]   (210784) : New worker (210786) forked
Jan 21 18:33:45 compute-0 neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf[210780]: [NOTICE]   (210784) : Loading success.
Jan 21 18:33:45 compute-0 nova_compute[183278]: 2026-01-21 18:33:45.088 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 18:33:45 compute-0 nova_compute[183278]: 2026-01-21 18:33:45.193 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 18:33:45 compute-0 nova_compute[183278]: 2026-01-21 18:33:45.246 183284 INFO nova.compute.manager [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Took 3.92 seconds to spawn the instance on the hypervisor.
Jan 21 18:33:45 compute-0 nova_compute[183278]: 2026-01-21 18:33:45.247 183284 DEBUG nova.compute.manager [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:33:45 compute-0 nova_compute[183278]: 2026-01-21 18:33:45.324 183284 INFO nova.compute.manager [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Took 4.36 seconds to build instance.
Jan 21 18:33:45 compute-0 nova_compute[183278]: 2026-01-21 18:33:45.339 183284 DEBUG oslo_concurrency.lockutils [None req-4744edef-ade7-488a-b07a-8976c32e0ea2 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "4e80cef6-fc6e-4f0f-b3a8-68f17d680983" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.431s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:33:45 compute-0 nova_compute[183278]: 2026-01-21 18:33:45.939 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:33:46 compute-0 podman[210796]: 2026-01-21 18:33:46.002447696 +0000 UTC m=+0.051054245 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent)
Jan 21 18:33:46 compute-0 podman[210795]: 2026-01-21 18:33:46.034310285 +0000 UTC m=+0.085862555 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 21 18:33:47 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:33:47.953 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a4db021d-a451-4e5f-8011-49af760bda68, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '21'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:33:48 compute-0 nova_compute[183278]: 2026-01-21 18:33:48.482 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:33:50 compute-0 nova_compute[183278]: 2026-01-21 18:33:50.940 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:33:51 compute-0 nova_compute[183278]: 2026-01-21 18:33:51.022 183284 DEBUG nova.virt.libvirt.driver [None req-629e2fa8-8b53-4fe8-b52e-1708d3188998 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Check if temp file /var/lib/nova/instances/tmp10vlh4u2 exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Jan 21 18:33:51 compute-0 nova_compute[183278]: 2026-01-21 18:33:51.022 183284 DEBUG nova.compute.manager [None req-629e2fa8-8b53-4fe8-b52e-1708d3188998 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp10vlh4u2',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='4e80cef6-fc6e-4f0f-b3a8-68f17d680983',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Jan 21 18:33:51 compute-0 podman[210837]: 2026-01-21 18:33:51.024420218 +0000 UTC m=+0.078004855 container health_status 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 18:33:51 compute-0 nova_compute[183278]: 2026-01-21 18:33:51.432 183284 DEBUG oslo_concurrency.processutils [None req-629e2fa8-8b53-4fe8-b52e-1708d3188998 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4e80cef6-fc6e-4f0f-b3a8-68f17d680983/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:33:51 compute-0 nova_compute[183278]: 2026-01-21 18:33:51.492 183284 DEBUG oslo_concurrency.processutils [None req-629e2fa8-8b53-4fe8-b52e-1708d3188998 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4e80cef6-fc6e-4f0f-b3a8-68f17d680983/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:33:51 compute-0 nova_compute[183278]: 2026-01-21 18:33:51.494 183284 DEBUG oslo_concurrency.processutils [None req-629e2fa8-8b53-4fe8-b52e-1708d3188998 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4e80cef6-fc6e-4f0f-b3a8-68f17d680983/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:33:51 compute-0 nova_compute[183278]: 2026-01-21 18:33:51.552 183284 DEBUG oslo_concurrency.processutils [None req-629e2fa8-8b53-4fe8-b52e-1708d3188998 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4e80cef6-fc6e-4f0f-b3a8-68f17d680983/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:33:53 compute-0 nova_compute[183278]: 2026-01-21 18:33:53.487 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:33:54 compute-0 sshd-session[210868]: Accepted publickey for nova from 192.168.122.101 port 37254 ssh2: ECDSA SHA256:29a5JNhHHz2bb0ACqZTr6qOKeSRnhiTRA8SK+rzn9gs
Jan 21 18:33:54 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Jan 21 18:33:54 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 21 18:33:54 compute-0 systemd-logind[782]: New session 38 of user nova.
Jan 21 18:33:54 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 21 18:33:54 compute-0 systemd[1]: Starting User Manager for UID 42436...
Jan 21 18:33:54 compute-0 systemd[210872]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 21 18:33:54 compute-0 systemd[210872]: Queued start job for default target Main User Target.
Jan 21 18:33:54 compute-0 systemd[210872]: Created slice User Application Slice.
Jan 21 18:33:54 compute-0 systemd[210872]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 21 18:33:54 compute-0 systemd[210872]: Started Daily Cleanup of User's Temporary Directories.
Jan 21 18:33:54 compute-0 systemd[210872]: Reached target Paths.
Jan 21 18:33:54 compute-0 systemd[210872]: Reached target Timers.
Jan 21 18:33:54 compute-0 systemd[210872]: Starting D-Bus User Message Bus Socket...
Jan 21 18:33:54 compute-0 systemd[210872]: Starting Create User's Volatile Files and Directories...
Jan 21 18:33:54 compute-0 systemd[210872]: Listening on D-Bus User Message Bus Socket.
Jan 21 18:33:54 compute-0 systemd[210872]: Finished Create User's Volatile Files and Directories.
Jan 21 18:33:54 compute-0 systemd[210872]: Reached target Sockets.
Jan 21 18:33:54 compute-0 systemd[210872]: Reached target Basic System.
Jan 21 18:33:54 compute-0 systemd[210872]: Reached target Main User Target.
Jan 21 18:33:54 compute-0 systemd[210872]: Startup finished in 135ms.
Jan 21 18:33:54 compute-0 systemd[1]: Started User Manager for UID 42436.
Jan 21 18:33:54 compute-0 systemd[1]: Started Session 38 of User nova.
Jan 21 18:33:54 compute-0 sshd-session[210868]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 21 18:33:54 compute-0 sshd-session[210887]: Received disconnect from 192.168.122.101 port 37254:11: disconnected by user
Jan 21 18:33:54 compute-0 sshd-session[210887]: Disconnected from user nova 192.168.122.101 port 37254
Jan 21 18:33:54 compute-0 sshd-session[210868]: pam_unix(sshd:session): session closed for user nova
Jan 21 18:33:54 compute-0 systemd[1]: session-38.scope: Deactivated successfully.
Jan 21 18:33:54 compute-0 systemd-logind[782]: Session 38 logged out. Waiting for processes to exit.
Jan 21 18:33:54 compute-0 systemd-logind[782]: Removed session 38.
Jan 21 18:33:54 compute-0 nova_compute[183278]: 2026-01-21 18:33:54.950 183284 DEBUG nova.compute.manager [req-abd72bdd-dfcc-4242-b606-227089c5e697 req-fdbb79a1-256a-49ff-a804-6ef92d118ea8 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Received event network-vif-unplugged-6964928b-8d3f-4817-a8c6-b2f4fc29ef45 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:33:54 compute-0 nova_compute[183278]: 2026-01-21 18:33:54.952 183284 DEBUG oslo_concurrency.lockutils [req-abd72bdd-dfcc-4242-b606-227089c5e697 req-fdbb79a1-256a-49ff-a804-6ef92d118ea8 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "4e80cef6-fc6e-4f0f-b3a8-68f17d680983-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:33:54 compute-0 nova_compute[183278]: 2026-01-21 18:33:54.952 183284 DEBUG oslo_concurrency.lockutils [req-abd72bdd-dfcc-4242-b606-227089c5e697 req-fdbb79a1-256a-49ff-a804-6ef92d118ea8 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "4e80cef6-fc6e-4f0f-b3a8-68f17d680983-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:33:54 compute-0 nova_compute[183278]: 2026-01-21 18:33:54.953 183284 DEBUG oslo_concurrency.lockutils [req-abd72bdd-dfcc-4242-b606-227089c5e697 req-fdbb79a1-256a-49ff-a804-6ef92d118ea8 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "4e80cef6-fc6e-4f0f-b3a8-68f17d680983-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:33:54 compute-0 nova_compute[183278]: 2026-01-21 18:33:54.953 183284 DEBUG nova.compute.manager [req-abd72bdd-dfcc-4242-b606-227089c5e697 req-fdbb79a1-256a-49ff-a804-6ef92d118ea8 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] No waiting events found dispatching network-vif-unplugged-6964928b-8d3f-4817-a8c6-b2f4fc29ef45 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:33:54 compute-0 nova_compute[183278]: 2026-01-21 18:33:54.953 183284 DEBUG nova.compute.manager [req-abd72bdd-dfcc-4242-b606-227089c5e697 req-fdbb79a1-256a-49ff-a804-6ef92d118ea8 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Received event network-vif-unplugged-6964928b-8d3f-4817-a8c6-b2f4fc29ef45 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 21 18:33:55 compute-0 nova_compute[183278]: 2026-01-21 18:33:55.809 183284 INFO nova.compute.manager [None req-629e2fa8-8b53-4fe8-b52e-1708d3188998 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Took 4.26 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Jan 21 18:33:55 compute-0 nova_compute[183278]: 2026-01-21 18:33:55.810 183284 DEBUG nova.compute.manager [None req-629e2fa8-8b53-4fe8-b52e-1708d3188998 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 18:33:55 compute-0 nova_compute[183278]: 2026-01-21 18:33:55.830 183284 DEBUG nova.compute.manager [None req-629e2fa8-8b53-4fe8-b52e-1708d3188998 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp10vlh4u2',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='4e80cef6-fc6e-4f0f-b3a8-68f17d680983',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(1d90ce88-b17f-4bb7-a1aa-c659a114eeca),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Jan 21 18:33:55 compute-0 nova_compute[183278]: 2026-01-21 18:33:55.852 183284 DEBUG nova.objects.instance [None req-629e2fa8-8b53-4fe8-b52e-1708d3188998 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] Lazy-loading 'migration_context' on Instance uuid 4e80cef6-fc6e-4f0f-b3a8-68f17d680983 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:33:55 compute-0 nova_compute[183278]: 2026-01-21 18:33:55.853 183284 DEBUG nova.virt.libvirt.driver [None req-629e2fa8-8b53-4fe8-b52e-1708d3188998 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Jan 21 18:33:55 compute-0 nova_compute[183278]: 2026-01-21 18:33:55.855 183284 DEBUG nova.virt.libvirt.driver [None req-629e2fa8-8b53-4fe8-b52e-1708d3188998 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Jan 21 18:33:55 compute-0 nova_compute[183278]: 2026-01-21 18:33:55.855 183284 DEBUG nova.virt.libvirt.driver [None req-629e2fa8-8b53-4fe8-b52e-1708d3188998 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Jan 21 18:33:55 compute-0 nova_compute[183278]: 2026-01-21 18:33:55.872 183284 DEBUG nova.virt.libvirt.vif [None req-629e2fa8-8b53-4fe8-b52e-1708d3188998 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T18:33:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-869424969',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-869424969',id=22,image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T18:33:45Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='fe688847145f4dee992c72dd40bbc1ac',ramdisk_id='',reservation_id='r-wnk1toif',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1753607426',owner_user_name='tempest-TestExecuteStrategies-1753607426-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T18:33:45Z,user_data=None,user_id='41dc6e790bc54fbfaf5c6007d3fa5f63',uuid=4e80cef6-fc6e-4f0f-b3a8-68f17d680983,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6964928b-8d3f-4817-a8c6-b2f4fc29ef45", "address": "fa:16:3e:e5:ff:50", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap6964928b-8d", "ovs_interfaceid": "6964928b-8d3f-4817-a8c6-b2f4fc29ef45", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 18:33:55 compute-0 nova_compute[183278]: 2026-01-21 18:33:55.872 183284 DEBUG nova.network.os_vif_util [None req-629e2fa8-8b53-4fe8-b52e-1708d3188998 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] Converting VIF {"id": "6964928b-8d3f-4817-a8c6-b2f4fc29ef45", "address": "fa:16:3e:e5:ff:50", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap6964928b-8d", "ovs_interfaceid": "6964928b-8d3f-4817-a8c6-b2f4fc29ef45", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 18:33:55 compute-0 nova_compute[183278]: 2026-01-21 18:33:55.873 183284 DEBUG nova.network.os_vif_util [None req-629e2fa8-8b53-4fe8-b52e-1708d3188998 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e5:ff:50,bridge_name='br-int',has_traffic_filtering=True,id=6964928b-8d3f-4817-a8c6-b2f4fc29ef45,network=Network(405ec01b-76d3-4c3c-a31b-5f16d9641fbf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6964928b-8d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 18:33:55 compute-0 nova_compute[183278]: 2026-01-21 18:33:55.874 183284 DEBUG nova.virt.libvirt.migration [None req-629e2fa8-8b53-4fe8-b52e-1708d3188998 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Updating guest XML with vif config: <interface type="ethernet">
Jan 21 18:33:55 compute-0 nova_compute[183278]:   <mac address="fa:16:3e:e5:ff:50"/>
Jan 21 18:33:55 compute-0 nova_compute[183278]:   <model type="virtio"/>
Jan 21 18:33:55 compute-0 nova_compute[183278]:   <driver name="vhost" rx_queue_size="512"/>
Jan 21 18:33:55 compute-0 nova_compute[183278]:   <mtu size="1442"/>
Jan 21 18:33:55 compute-0 nova_compute[183278]:   <target dev="tap6964928b-8d"/>
Jan 21 18:33:55 compute-0 nova_compute[183278]: </interface>
Jan 21 18:33:55 compute-0 nova_compute[183278]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Jan 21 18:33:55 compute-0 nova_compute[183278]: 2026-01-21 18:33:55.874 183284 DEBUG nova.virt.libvirt.driver [None req-629e2fa8-8b53-4fe8-b52e-1708d3188998 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Jan 21 18:33:55 compute-0 nova_compute[183278]: 2026-01-21 18:33:55.942 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:33:56 compute-0 nova_compute[183278]: 2026-01-21 18:33:56.358 183284 DEBUG nova.virt.libvirt.migration [None req-629e2fa8-8b53-4fe8-b52e-1708d3188998 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 21 18:33:56 compute-0 nova_compute[183278]: 2026-01-21 18:33:56.358 183284 INFO nova.virt.libvirt.migration [None req-629e2fa8-8b53-4fe8-b52e-1708d3188998 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Increasing downtime to 50 ms after 0 sec elapsed time
Jan 21 18:33:56 compute-0 nova_compute[183278]: 2026-01-21 18:33:56.413 183284 INFO nova.virt.libvirt.driver [None req-629e2fa8-8b53-4fe8-b52e-1708d3188998 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Jan 21 18:33:56 compute-0 nova_compute[183278]: 2026-01-21 18:33:56.915 183284 DEBUG nova.virt.libvirt.migration [None req-629e2fa8-8b53-4fe8-b52e-1708d3188998 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 21 18:33:56 compute-0 nova_compute[183278]: 2026-01-21 18:33:56.916 183284 DEBUG nova.virt.libvirt.migration [None req-629e2fa8-8b53-4fe8-b52e-1708d3188998 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Jan 21 18:33:57 compute-0 nova_compute[183278]: 2026-01-21 18:33:57.015 183284 DEBUG nova.compute.manager [req-43b9b0b6-6ec4-4c95-a353-247f06d31368 req-e2b13554-51d2-4bd2-ab0e-89429d1ecec9 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Received event network-vif-plugged-6964928b-8d3f-4817-a8c6-b2f4fc29ef45 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:33:57 compute-0 nova_compute[183278]: 2026-01-21 18:33:57.015 183284 DEBUG oslo_concurrency.lockutils [req-43b9b0b6-6ec4-4c95-a353-247f06d31368 req-e2b13554-51d2-4bd2-ab0e-89429d1ecec9 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "4e80cef6-fc6e-4f0f-b3a8-68f17d680983-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:33:57 compute-0 nova_compute[183278]: 2026-01-21 18:33:57.016 183284 DEBUG oslo_concurrency.lockutils [req-43b9b0b6-6ec4-4c95-a353-247f06d31368 req-e2b13554-51d2-4bd2-ab0e-89429d1ecec9 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "4e80cef6-fc6e-4f0f-b3a8-68f17d680983-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:33:57 compute-0 nova_compute[183278]: 2026-01-21 18:33:57.016 183284 DEBUG oslo_concurrency.lockutils [req-43b9b0b6-6ec4-4c95-a353-247f06d31368 req-e2b13554-51d2-4bd2-ab0e-89429d1ecec9 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "4e80cef6-fc6e-4f0f-b3a8-68f17d680983-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:33:57 compute-0 nova_compute[183278]: 2026-01-21 18:33:57.016 183284 DEBUG nova.compute.manager [req-43b9b0b6-6ec4-4c95-a353-247f06d31368 req-e2b13554-51d2-4bd2-ab0e-89429d1ecec9 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] No waiting events found dispatching network-vif-plugged-6964928b-8d3f-4817-a8c6-b2f4fc29ef45 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:33:57 compute-0 nova_compute[183278]: 2026-01-21 18:33:57.016 183284 WARNING nova.compute.manager [req-43b9b0b6-6ec4-4c95-a353-247f06d31368 req-e2b13554-51d2-4bd2-ab0e-89429d1ecec9 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Received unexpected event network-vif-plugged-6964928b-8d3f-4817-a8c6-b2f4fc29ef45 for instance with vm_state active and task_state migrating.
Jan 21 18:33:57 compute-0 nova_compute[183278]: 2026-01-21 18:33:57.017 183284 DEBUG nova.compute.manager [req-43b9b0b6-6ec4-4c95-a353-247f06d31368 req-e2b13554-51d2-4bd2-ab0e-89429d1ecec9 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Received event network-changed-6964928b-8d3f-4817-a8c6-b2f4fc29ef45 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:33:57 compute-0 nova_compute[183278]: 2026-01-21 18:33:57.017 183284 DEBUG nova.compute.manager [req-43b9b0b6-6ec4-4c95-a353-247f06d31368 req-e2b13554-51d2-4bd2-ab0e-89429d1ecec9 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Refreshing instance network info cache due to event network-changed-6964928b-8d3f-4817-a8c6-b2f4fc29ef45. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 18:33:57 compute-0 nova_compute[183278]: 2026-01-21 18:33:57.017 183284 DEBUG oslo_concurrency.lockutils [req-43b9b0b6-6ec4-4c95-a353-247f06d31368 req-e2b13554-51d2-4bd2-ab0e-89429d1ecec9 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "refresh_cache-4e80cef6-fc6e-4f0f-b3a8-68f17d680983" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:33:57 compute-0 nova_compute[183278]: 2026-01-21 18:33:57.017 183284 DEBUG oslo_concurrency.lockutils [req-43b9b0b6-6ec4-4c95-a353-247f06d31368 req-e2b13554-51d2-4bd2-ab0e-89429d1ecec9 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquired lock "refresh_cache-4e80cef6-fc6e-4f0f-b3a8-68f17d680983" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 18:33:57 compute-0 nova_compute[183278]: 2026-01-21 18:33:57.018 183284 DEBUG nova.network.neutron [req-43b9b0b6-6ec4-4c95-a353-247f06d31368 req-e2b13554-51d2-4bd2-ab0e-89429d1ecec9 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Refreshing network info cache for port 6964928b-8d3f-4817-a8c6-b2f4fc29ef45 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 18:33:57 compute-0 nova_compute[183278]: 2026-01-21 18:33:57.418 183284 DEBUG nova.virt.libvirt.migration [None req-629e2fa8-8b53-4fe8-b52e-1708d3188998 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 21 18:33:57 compute-0 nova_compute[183278]: 2026-01-21 18:33:57.419 183284 DEBUG nova.virt.libvirt.migration [None req-629e2fa8-8b53-4fe8-b52e-1708d3188998 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Jan 21 18:33:57 compute-0 nova_compute[183278]: 2026-01-21 18:33:57.922 183284 DEBUG nova.virt.libvirt.migration [None req-629e2fa8-8b53-4fe8-b52e-1708d3188998 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 21 18:33:57 compute-0 nova_compute[183278]: 2026-01-21 18:33:57.923 183284 DEBUG nova.virt.libvirt.migration [None req-629e2fa8-8b53-4fe8-b52e-1708d3188998 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Jan 21 18:33:58 compute-0 nova_compute[183278]: 2026-01-21 18:33:58.127 183284 DEBUG nova.network.neutron [req-43b9b0b6-6ec4-4c95-a353-247f06d31368 req-e2b13554-51d2-4bd2-ab0e-89429d1ecec9 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Updated VIF entry in instance network info cache for port 6964928b-8d3f-4817-a8c6-b2f4fc29ef45. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 18:33:58 compute-0 nova_compute[183278]: 2026-01-21 18:33:58.128 183284 DEBUG nova.network.neutron [req-43b9b0b6-6ec4-4c95-a353-247f06d31368 req-e2b13554-51d2-4bd2-ab0e-89429d1ecec9 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Updating instance_info_cache with network_info: [{"id": "6964928b-8d3f-4817-a8c6-b2f4fc29ef45", "address": "fa:16:3e:e5:ff:50", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6964928b-8d", "ovs_interfaceid": "6964928b-8d3f-4817-a8c6-b2f4fc29ef45", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:33:58 compute-0 nova_compute[183278]: 2026-01-21 18:33:58.150 183284 DEBUG oslo_concurrency.lockutils [req-43b9b0b6-6ec4-4c95-a353-247f06d31368 req-e2b13554-51d2-4bd2-ab0e-89429d1ecec9 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Releasing lock "refresh_cache-4e80cef6-fc6e-4f0f-b3a8-68f17d680983" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 18:33:58 compute-0 ovn_controller[95419]: 2026-01-21T18:33:58Z|00026|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:e5:ff:50 10.100.0.3
Jan 21 18:33:58 compute-0 ovn_controller[95419]: 2026-01-21T18:33:58Z|00027|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e5:ff:50 10.100.0.3
Jan 21 18:33:58 compute-0 nova_compute[183278]: 2026-01-21 18:33:58.427 183284 DEBUG nova.virt.libvirt.migration [None req-629e2fa8-8b53-4fe8-b52e-1708d3188998 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 21 18:33:58 compute-0 nova_compute[183278]: 2026-01-21 18:33:58.427 183284 DEBUG nova.virt.libvirt.migration [None req-629e2fa8-8b53-4fe8-b52e-1708d3188998 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Jan 21 18:33:58 compute-0 nova_compute[183278]: 2026-01-21 18:33:58.468 183284 DEBUG nova.virt.driver [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Emitting event <LifecycleEvent: 1769020438.4679146, 4e80cef6-fc6e-4f0f-b3a8-68f17d680983 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:33:58 compute-0 nova_compute[183278]: 2026-01-21 18:33:58.468 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] VM Paused (Lifecycle Event)
Jan 21 18:33:58 compute-0 nova_compute[183278]: 2026-01-21 18:33:58.492 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:33:58 compute-0 nova_compute[183278]: 2026-01-21 18:33:58.613 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:33:58 compute-0 nova_compute[183278]: 2026-01-21 18:33:58.617 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 18:33:58 compute-0 nova_compute[183278]: 2026-01-21 18:33:58.640 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] During sync_power_state the instance has a pending task (migrating). Skip.
Jan 21 18:33:58 compute-0 kernel: tap6964928b-8d (unregistering): left promiscuous mode
Jan 21 18:33:58 compute-0 NetworkManager[55506]: <info>  [1769020438.6523] device (tap6964928b-8d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 18:33:58 compute-0 ovn_controller[95419]: 2026-01-21T18:33:58Z|00161|binding|INFO|Releasing lport 6964928b-8d3f-4817-a8c6-b2f4fc29ef45 from this chassis (sb_readonly=0)
Jan 21 18:33:58 compute-0 ovn_controller[95419]: 2026-01-21T18:33:58Z|00162|binding|INFO|Setting lport 6964928b-8d3f-4817-a8c6-b2f4fc29ef45 down in Southbound
Jan 21 18:33:58 compute-0 nova_compute[183278]: 2026-01-21 18:33:58.660 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:33:58 compute-0 ovn_controller[95419]: 2026-01-21T18:33:58Z|00163|binding|INFO|Removing iface tap6964928b-8d ovn-installed in OVS
Jan 21 18:33:58 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:33:58.668 104698 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e5:ff:50 10.100.0.3'], port_security=['fa:16:3e:e5:ff:50 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '88a62794-b4a4-47e3-9cce-91e574e684c1'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '4e80cef6-fc6e-4f0f-b3a8-68f17d680983', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-405ec01b-76d3-4c3c-a31b-5f16d9641fbf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe688847145f4dee992c72dd40bbc1ac', 'neutron:revision_number': '8', 'neutron:security_group_ids': '772bc98e-4f63-476c-ac15-1e185ee339f7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=64b67886-c0d9-40d2-a2d0-cf96a9cd3c14, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>], logical_port=6964928b-8d3f-4817-a8c6-b2f4fc29ef45) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 18:33:58 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:33:58.670 104698 INFO neutron.agent.ovn.metadata.agent [-] Port 6964928b-8d3f-4817-a8c6-b2f4fc29ef45 in datapath 405ec01b-76d3-4c3c-a31b-5f16d9641fbf unbound from our chassis
Jan 21 18:33:58 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:33:58.671 104698 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 405ec01b-76d3-4c3c-a31b-5f16d9641fbf, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 21 18:33:58 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:33:58.672 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[826d0746-8065-4173-b151-179e7e6da285]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:33:58 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:33:58.673 104698 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf namespace which is not needed anymore
Jan 21 18:33:58 compute-0 nova_compute[183278]: 2026-01-21 18:33:58.680 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:33:58 compute-0 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d00000016.scope: Deactivated successfully.
Jan 21 18:33:58 compute-0 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d00000016.scope: Consumed 13.353s CPU time.
Jan 21 18:33:58 compute-0 systemd-machined[154592]: Machine qemu-15-instance-00000016 terminated.
Jan 21 18:33:58 compute-0 nova_compute[183278]: 2026-01-21 18:33:58.862 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:33:58 compute-0 neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf[210780]: [NOTICE]   (210784) : haproxy version is 2.8.14-c23fe91
Jan 21 18:33:58 compute-0 neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf[210780]: [NOTICE]   (210784) : path to executable is /usr/sbin/haproxy
Jan 21 18:33:58 compute-0 neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf[210780]: [WARNING]  (210784) : Exiting Master process...
Jan 21 18:33:58 compute-0 nova_compute[183278]: 2026-01-21 18:33:58.900 183284 DEBUG nova.virt.libvirt.driver [None req-629e2fa8-8b53-4fe8-b52e-1708d3188998 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Jan 21 18:33:58 compute-0 nova_compute[183278]: 2026-01-21 18:33:58.900 183284 DEBUG nova.virt.libvirt.driver [None req-629e2fa8-8b53-4fe8-b52e-1708d3188998 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Jan 21 18:33:58 compute-0 nova_compute[183278]: 2026-01-21 18:33:58.900 183284 DEBUG nova.virt.libvirt.driver [None req-629e2fa8-8b53-4fe8-b52e-1708d3188998 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Jan 21 18:33:58 compute-0 neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf[210780]: [ALERT]    (210784) : Current worker (210786) exited with code 143 (Terminated)
Jan 21 18:33:58 compute-0 neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf[210780]: [WARNING]  (210784) : All workers exited. Exiting... (0)
Jan 21 18:33:58 compute-0 systemd[1]: libpod-1d51063bb9fd01bfb38a110f30b0d2a1c26ffef37a3ac177f042b802e919afdf.scope: Deactivated successfully.
Jan 21 18:33:58 compute-0 podman[210934]: 2026-01-21 18:33:58.910886841 +0000 UTC m=+0.139979932 container died 1d51063bb9fd01bfb38a110f30b0d2a1c26ffef37a3ac177f042b802e919afdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 21 18:33:58 compute-0 nova_compute[183278]: 2026-01-21 18:33:58.929 183284 DEBUG nova.virt.libvirt.guest [None req-629e2fa8-8b53-4fe8-b52e-1708d3188998 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid '4e80cef6-fc6e-4f0f-b3a8-68f17d680983' (instance-00000016) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Jan 21 18:33:58 compute-0 nova_compute[183278]: 2026-01-21 18:33:58.930 183284 INFO nova.virt.libvirt.driver [None req-629e2fa8-8b53-4fe8-b52e-1708d3188998 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Migration operation has completed
Jan 21 18:33:58 compute-0 nova_compute[183278]: 2026-01-21 18:33:58.930 183284 INFO nova.compute.manager [None req-629e2fa8-8b53-4fe8-b52e-1708d3188998 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] _post_live_migration() is started..
Jan 21 18:33:59 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1d51063bb9fd01bfb38a110f30b0d2a1c26ffef37a3ac177f042b802e919afdf-userdata-shm.mount: Deactivated successfully.
Jan 21 18:33:59 compute-0 systemd[1]: var-lib-containers-storage-overlay-e45a797b97a3ddaedcb15f037190b7a5e948b0e53176d31957f78b43f3d147a6-merged.mount: Deactivated successfully.
Jan 21 18:33:59 compute-0 podman[210934]: 2026-01-21 18:33:59.298127745 +0000 UTC m=+0.527220836 container cleanup 1d51063bb9fd01bfb38a110f30b0d2a1c26ffef37a3ac177f042b802e919afdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 21 18:33:59 compute-0 systemd[1]: libpod-conmon-1d51063bb9fd01bfb38a110f30b0d2a1c26ffef37a3ac177f042b802e919afdf.scope: Deactivated successfully.
Jan 21 18:33:59 compute-0 podman[210981]: 2026-01-21 18:33:59.476446941 +0000 UTC m=+0.155804104 container remove 1d51063bb9fd01bfb38a110f30b0d2a1c26ffef37a3ac177f042b802e919afdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 21 18:33:59 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:33:59.482 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[f3313fe5-7700-4e53-abba-32a5218b7083]: (4, ('Wed Jan 21 06:33:58 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf (1d51063bb9fd01bfb38a110f30b0d2a1c26ffef37a3ac177f042b802e919afdf)\n1d51063bb9fd01bfb38a110f30b0d2a1c26ffef37a3ac177f042b802e919afdf\nWed Jan 21 06:33:59 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf (1d51063bb9fd01bfb38a110f30b0d2a1c26ffef37a3ac177f042b802e919afdf)\n1d51063bb9fd01bfb38a110f30b0d2a1c26ffef37a3ac177f042b802e919afdf\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:33:59 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:33:59.484 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[7390bab5-2a11-44f5-9dca-f78b0a05074e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:33:59 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:33:59.485 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap405ec01b-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:33:59 compute-0 kernel: tap405ec01b-70: left promiscuous mode
Jan 21 18:33:59 compute-0 nova_compute[183278]: 2026-01-21 18:33:59.488 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:33:59 compute-0 nova_compute[183278]: 2026-01-21 18:33:59.504 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:33:59 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:33:59.510 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[7d12a9a2-067a-433c-ad82-c26905973cb8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:33:59 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:33:59.530 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[89cd2e66-36e2-484a-961c-06140f66fd57]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:33:59 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:33:59.531 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[21e40192-03f8-4a19-98d6-85589677ba3d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:33:59 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:33:59.545 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[5712304a-d6f5-4128-b7df-0ced0959b292]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 498773, 'reachable_time': 41941, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211000, 'error': None, 'target': 'ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:33:59 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:33:59.548 105036 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 21 18:33:59 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:33:59.548 105036 DEBUG oslo.privsep.daemon [-] privsep: reply[3e9dadf1-79e4-4441-af23-3db54d0ace0f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:33:59 compute-0 systemd[1]: run-netns-ovnmeta\x2d405ec01b\x2d76d3\x2d4c3c\x2da31b\x2d5f16d9641fbf.mount: Deactivated successfully.
Jan 21 18:33:59 compute-0 podman[192560]: time="2026-01-21T18:33:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 18:33:59 compute-0 podman[192560]: @ - - [21/Jan/2026:18:33:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15354 "" "Go-http-client/1.1"
Jan 21 18:33:59 compute-0 podman[192560]: @ - - [21/Jan/2026:18:33:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2173 "" "Go-http-client/1.1"
Jan 21 18:34:00 compute-0 nova_compute[183278]: 2026-01-21 18:34:00.844 183284 DEBUG nova.compute.manager [req-91ca6f96-1c70-4511-9229-16dc8cc80265 req-58016bb0-da2f-4f83-a3d0-59811636242a 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Received event network-vif-unplugged-6964928b-8d3f-4817-a8c6-b2f4fc29ef45 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:34:00 compute-0 nova_compute[183278]: 2026-01-21 18:34:00.844 183284 DEBUG oslo_concurrency.lockutils [req-91ca6f96-1c70-4511-9229-16dc8cc80265 req-58016bb0-da2f-4f83-a3d0-59811636242a 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "4e80cef6-fc6e-4f0f-b3a8-68f17d680983-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:34:00 compute-0 nova_compute[183278]: 2026-01-21 18:34:00.845 183284 DEBUG oslo_concurrency.lockutils [req-91ca6f96-1c70-4511-9229-16dc8cc80265 req-58016bb0-da2f-4f83-a3d0-59811636242a 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "4e80cef6-fc6e-4f0f-b3a8-68f17d680983-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:34:00 compute-0 nova_compute[183278]: 2026-01-21 18:34:00.845 183284 DEBUG oslo_concurrency.lockutils [req-91ca6f96-1c70-4511-9229-16dc8cc80265 req-58016bb0-da2f-4f83-a3d0-59811636242a 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "4e80cef6-fc6e-4f0f-b3a8-68f17d680983-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:34:00 compute-0 nova_compute[183278]: 2026-01-21 18:34:00.845 183284 DEBUG nova.compute.manager [req-91ca6f96-1c70-4511-9229-16dc8cc80265 req-58016bb0-da2f-4f83-a3d0-59811636242a 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] No waiting events found dispatching network-vif-unplugged-6964928b-8d3f-4817-a8c6-b2f4fc29ef45 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:34:00 compute-0 nova_compute[183278]: 2026-01-21 18:34:00.845 183284 DEBUG nova.compute.manager [req-91ca6f96-1c70-4511-9229-16dc8cc80265 req-58016bb0-da2f-4f83-a3d0-59811636242a 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Received event network-vif-unplugged-6964928b-8d3f-4817-a8c6-b2f4fc29ef45 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 21 18:34:00 compute-0 nova_compute[183278]: 2026-01-21 18:34:00.928 183284 DEBUG nova.network.neutron [None req-629e2fa8-8b53-4fe8-b52e-1708d3188998 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] Activated binding for port 6964928b-8d3f-4817-a8c6-b2f4fc29ef45 and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Jan 21 18:34:00 compute-0 nova_compute[183278]: 2026-01-21 18:34:00.929 183284 DEBUG nova.compute.manager [None req-629e2fa8-8b53-4fe8-b52e-1708d3188998 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "6964928b-8d3f-4817-a8c6-b2f4fc29ef45", "address": "fa:16:3e:e5:ff:50", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6964928b-8d", "ovs_interfaceid": "6964928b-8d3f-4817-a8c6-b2f4fc29ef45", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Jan 21 18:34:00 compute-0 nova_compute[183278]: 2026-01-21 18:34:00.930 183284 DEBUG nova.virt.libvirt.vif [None req-629e2fa8-8b53-4fe8-b52e-1708d3188998 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T18:33:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-869424969',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-869424969',id=22,image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T18:33:45Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='fe688847145f4dee992c72dd40bbc1ac',ramdisk_id='',reservation_id='r-wnk1toif',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',ow
ner_project_name='tempest-TestExecuteStrategies-1753607426',owner_user_name='tempest-TestExecuteStrategies-1753607426-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T18:33:48Z,user_data=None,user_id='41dc6e790bc54fbfaf5c6007d3fa5f63',uuid=4e80cef6-fc6e-4f0f-b3a8-68f17d680983,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6964928b-8d3f-4817-a8c6-b2f4fc29ef45", "address": "fa:16:3e:e5:ff:50", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6964928b-8d", "ovs_interfaceid": "6964928b-8d3f-4817-a8c6-b2f4fc29ef45", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 18:34:00 compute-0 nova_compute[183278]: 2026-01-21 18:34:00.930 183284 DEBUG nova.network.os_vif_util [None req-629e2fa8-8b53-4fe8-b52e-1708d3188998 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] Converting VIF {"id": "6964928b-8d3f-4817-a8c6-b2f4fc29ef45", "address": "fa:16:3e:e5:ff:50", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6964928b-8d", "ovs_interfaceid": "6964928b-8d3f-4817-a8c6-b2f4fc29ef45", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 18:34:00 compute-0 nova_compute[183278]: 2026-01-21 18:34:00.930 183284 DEBUG nova.network.os_vif_util [None req-629e2fa8-8b53-4fe8-b52e-1708d3188998 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e5:ff:50,bridge_name='br-int',has_traffic_filtering=True,id=6964928b-8d3f-4817-a8c6-b2f4fc29ef45,network=Network(405ec01b-76d3-4c3c-a31b-5f16d9641fbf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6964928b-8d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 18:34:00 compute-0 nova_compute[183278]: 2026-01-21 18:34:00.931 183284 DEBUG os_vif [None req-629e2fa8-8b53-4fe8-b52e-1708d3188998 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e5:ff:50,bridge_name='br-int',has_traffic_filtering=True,id=6964928b-8d3f-4817-a8c6-b2f4fc29ef45,network=Network(405ec01b-76d3-4c3c-a31b-5f16d9641fbf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6964928b-8d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 21 18:34:00 compute-0 nova_compute[183278]: 2026-01-21 18:34:00.932 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:34:00 compute-0 nova_compute[183278]: 2026-01-21 18:34:00.932 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6964928b-8d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:34:00 compute-0 nova_compute[183278]: 2026-01-21 18:34:00.933 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:34:00 compute-0 nova_compute[183278]: 2026-01-21 18:34:00.935 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:34:00 compute-0 nova_compute[183278]: 2026-01-21 18:34:00.937 183284 INFO os_vif [None req-629e2fa8-8b53-4fe8-b52e-1708d3188998 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e5:ff:50,bridge_name='br-int',has_traffic_filtering=True,id=6964928b-8d3f-4817-a8c6-b2f4fc29ef45,network=Network(405ec01b-76d3-4c3c-a31b-5f16d9641fbf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6964928b-8d')
Jan 21 18:34:00 compute-0 nova_compute[183278]: 2026-01-21 18:34:00.937 183284 DEBUG oslo_concurrency.lockutils [None req-629e2fa8-8b53-4fe8-b52e-1708d3188998 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:34:00 compute-0 nova_compute[183278]: 2026-01-21 18:34:00.937 183284 DEBUG oslo_concurrency.lockutils [None req-629e2fa8-8b53-4fe8-b52e-1708d3188998 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:34:00 compute-0 nova_compute[183278]: 2026-01-21 18:34:00.938 183284 DEBUG oslo_concurrency.lockutils [None req-629e2fa8-8b53-4fe8-b52e-1708d3188998 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:34:00 compute-0 nova_compute[183278]: 2026-01-21 18:34:00.938 183284 DEBUG nova.compute.manager [None req-629e2fa8-8b53-4fe8-b52e-1708d3188998 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Jan 21 18:34:00 compute-0 nova_compute[183278]: 2026-01-21 18:34:00.938 183284 INFO nova.virt.libvirt.driver [None req-629e2fa8-8b53-4fe8-b52e-1708d3188998 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Deleting instance files /var/lib/nova/instances/4e80cef6-fc6e-4f0f-b3a8-68f17d680983_del
Jan 21 18:34:00 compute-0 nova_compute[183278]: 2026-01-21 18:34:00.939 183284 INFO nova.virt.libvirt.driver [None req-629e2fa8-8b53-4fe8-b52e-1708d3188998 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Deletion of /var/lib/nova/instances/4e80cef6-fc6e-4f0f-b3a8-68f17d680983_del complete
Jan 21 18:34:00 compute-0 nova_compute[183278]: 2026-01-21 18:34:00.943 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:34:01 compute-0 openstack_network_exporter[195402]: ERROR   18:34:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 21 18:34:01 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:34:01 compute-0 openstack_network_exporter[195402]: ERROR   18:34:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 21 18:34:01 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:34:01 compute-0 nova_compute[183278]: 2026-01-21 18:34:01.452 183284 DEBUG nova.compute.manager [req-e8b32ef0-ea87-455a-8151-fca38ead7d3e req-266880fd-bb83-4da4-b932-2fada378d96b 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Received event network-vif-plugged-6964928b-8d3f-4817-a8c6-b2f4fc29ef45 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:34:01 compute-0 nova_compute[183278]: 2026-01-21 18:34:01.452 183284 DEBUG oslo_concurrency.lockutils [req-e8b32ef0-ea87-455a-8151-fca38ead7d3e req-266880fd-bb83-4da4-b932-2fada378d96b 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "4e80cef6-fc6e-4f0f-b3a8-68f17d680983-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:34:01 compute-0 nova_compute[183278]: 2026-01-21 18:34:01.453 183284 DEBUG oslo_concurrency.lockutils [req-e8b32ef0-ea87-455a-8151-fca38ead7d3e req-266880fd-bb83-4da4-b932-2fada378d96b 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "4e80cef6-fc6e-4f0f-b3a8-68f17d680983-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:34:01 compute-0 nova_compute[183278]: 2026-01-21 18:34:01.453 183284 DEBUG oslo_concurrency.lockutils [req-e8b32ef0-ea87-455a-8151-fca38ead7d3e req-266880fd-bb83-4da4-b932-2fada378d96b 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "4e80cef6-fc6e-4f0f-b3a8-68f17d680983-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:34:01 compute-0 nova_compute[183278]: 2026-01-21 18:34:01.453 183284 DEBUG nova.compute.manager [req-e8b32ef0-ea87-455a-8151-fca38ead7d3e req-266880fd-bb83-4da4-b932-2fada378d96b 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] No waiting events found dispatching network-vif-plugged-6964928b-8d3f-4817-a8c6-b2f4fc29ef45 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:34:01 compute-0 nova_compute[183278]: 2026-01-21 18:34:01.453 183284 WARNING nova.compute.manager [req-e8b32ef0-ea87-455a-8151-fca38ead7d3e req-266880fd-bb83-4da4-b932-2fada378d96b 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Received unexpected event network-vif-plugged-6964928b-8d3f-4817-a8c6-b2f4fc29ef45 for instance with vm_state active and task_state migrating.
Jan 21 18:34:03 compute-0 nova_compute[183278]: 2026-01-21 18:34:03.545 183284 DEBUG nova.compute.manager [req-b312bd96-50d1-4d41-9044-836e7b521a6c req-1deff475-a460-47a8-908e-ed79c16c6532 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Received event network-vif-plugged-6964928b-8d3f-4817-a8c6-b2f4fc29ef45 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:34:03 compute-0 nova_compute[183278]: 2026-01-21 18:34:03.545 183284 DEBUG oslo_concurrency.lockutils [req-b312bd96-50d1-4d41-9044-836e7b521a6c req-1deff475-a460-47a8-908e-ed79c16c6532 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "4e80cef6-fc6e-4f0f-b3a8-68f17d680983-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:34:03 compute-0 nova_compute[183278]: 2026-01-21 18:34:03.545 183284 DEBUG oslo_concurrency.lockutils [req-b312bd96-50d1-4d41-9044-836e7b521a6c req-1deff475-a460-47a8-908e-ed79c16c6532 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "4e80cef6-fc6e-4f0f-b3a8-68f17d680983-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:34:03 compute-0 nova_compute[183278]: 2026-01-21 18:34:03.545 183284 DEBUG oslo_concurrency.lockutils [req-b312bd96-50d1-4d41-9044-836e7b521a6c req-1deff475-a460-47a8-908e-ed79c16c6532 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "4e80cef6-fc6e-4f0f-b3a8-68f17d680983-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:34:03 compute-0 nova_compute[183278]: 2026-01-21 18:34:03.546 183284 DEBUG nova.compute.manager [req-b312bd96-50d1-4d41-9044-836e7b521a6c req-1deff475-a460-47a8-908e-ed79c16c6532 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] No waiting events found dispatching network-vif-plugged-6964928b-8d3f-4817-a8c6-b2f4fc29ef45 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:34:03 compute-0 nova_compute[183278]: 2026-01-21 18:34:03.546 183284 WARNING nova.compute.manager [req-b312bd96-50d1-4d41-9044-836e7b521a6c req-1deff475-a460-47a8-908e-ed79c16c6532 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Received unexpected event network-vif-plugged-6964928b-8d3f-4817-a8c6-b2f4fc29ef45 for instance with vm_state active and task_state migrating.
Jan 21 18:34:03 compute-0 nova_compute[183278]: 2026-01-21 18:34:03.546 183284 DEBUG nova.compute.manager [req-b312bd96-50d1-4d41-9044-836e7b521a6c req-1deff475-a460-47a8-908e-ed79c16c6532 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Received event network-vif-plugged-6964928b-8d3f-4817-a8c6-b2f4fc29ef45 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:34:03 compute-0 nova_compute[183278]: 2026-01-21 18:34:03.546 183284 DEBUG oslo_concurrency.lockutils [req-b312bd96-50d1-4d41-9044-836e7b521a6c req-1deff475-a460-47a8-908e-ed79c16c6532 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "4e80cef6-fc6e-4f0f-b3a8-68f17d680983-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:34:03 compute-0 nova_compute[183278]: 2026-01-21 18:34:03.546 183284 DEBUG oslo_concurrency.lockutils [req-b312bd96-50d1-4d41-9044-836e7b521a6c req-1deff475-a460-47a8-908e-ed79c16c6532 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "4e80cef6-fc6e-4f0f-b3a8-68f17d680983-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:34:03 compute-0 nova_compute[183278]: 2026-01-21 18:34:03.547 183284 DEBUG oslo_concurrency.lockutils [req-b312bd96-50d1-4d41-9044-836e7b521a6c req-1deff475-a460-47a8-908e-ed79c16c6532 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "4e80cef6-fc6e-4f0f-b3a8-68f17d680983-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:34:03 compute-0 nova_compute[183278]: 2026-01-21 18:34:03.547 183284 DEBUG nova.compute.manager [req-b312bd96-50d1-4d41-9044-836e7b521a6c req-1deff475-a460-47a8-908e-ed79c16c6532 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] No waiting events found dispatching network-vif-plugged-6964928b-8d3f-4817-a8c6-b2f4fc29ef45 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:34:03 compute-0 nova_compute[183278]: 2026-01-21 18:34:03.547 183284 WARNING nova.compute.manager [req-b312bd96-50d1-4d41-9044-836e7b521a6c req-1deff475-a460-47a8-908e-ed79c16c6532 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Received unexpected event network-vif-plugged-6964928b-8d3f-4817-a8c6-b2f4fc29ef45 for instance with vm_state active and task_state migrating.
Jan 21 18:34:04 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Jan 21 18:34:04 compute-0 systemd[210872]: Activating special unit Exit the Session...
Jan 21 18:34:04 compute-0 systemd[210872]: Stopped target Main User Target.
Jan 21 18:34:04 compute-0 systemd[210872]: Stopped target Basic System.
Jan 21 18:34:04 compute-0 systemd[210872]: Stopped target Paths.
Jan 21 18:34:04 compute-0 systemd[210872]: Stopped target Sockets.
Jan 21 18:34:04 compute-0 systemd[210872]: Stopped target Timers.
Jan 21 18:34:04 compute-0 systemd[210872]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 21 18:34:04 compute-0 systemd[210872]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 21 18:34:04 compute-0 systemd[210872]: Closed D-Bus User Message Bus Socket.
Jan 21 18:34:04 compute-0 systemd[210872]: Stopped Create User's Volatile Files and Directories.
Jan 21 18:34:04 compute-0 systemd[210872]: Removed slice User Application Slice.
Jan 21 18:34:04 compute-0 systemd[210872]: Reached target Shutdown.
Jan 21 18:34:04 compute-0 systemd[210872]: Finished Exit the Session.
Jan 21 18:34:04 compute-0 systemd[210872]: Reached target Exit the Session.
Jan 21 18:34:04 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Jan 21 18:34:04 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Jan 21 18:34:04 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 21 18:34:04 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 21 18:34:04 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 21 18:34:04 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 21 18:34:04 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Jan 21 18:34:05 compute-0 nova_compute[183278]: 2026-01-21 18:34:05.935 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:34:05 compute-0 nova_compute[183278]: 2026-01-21 18:34:05.944 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:34:07 compute-0 nova_compute[183278]: 2026-01-21 18:34:07.430 183284 DEBUG oslo_concurrency.lockutils [None req-629e2fa8-8b53-4fe8-b52e-1708d3188998 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] Acquiring lock "4e80cef6-fc6e-4f0f-b3a8-68f17d680983-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:34:07 compute-0 nova_compute[183278]: 2026-01-21 18:34:07.430 183284 DEBUG oslo_concurrency.lockutils [None req-629e2fa8-8b53-4fe8-b52e-1708d3188998 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] Lock "4e80cef6-fc6e-4f0f-b3a8-68f17d680983-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:34:07 compute-0 nova_compute[183278]: 2026-01-21 18:34:07.430 183284 DEBUG oslo_concurrency.lockutils [None req-629e2fa8-8b53-4fe8-b52e-1708d3188998 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] Lock "4e80cef6-fc6e-4f0f-b3a8-68f17d680983-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:34:07 compute-0 nova_compute[183278]: 2026-01-21 18:34:07.449 183284 DEBUG oslo_concurrency.lockutils [None req-629e2fa8-8b53-4fe8-b52e-1708d3188998 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:34:07 compute-0 nova_compute[183278]: 2026-01-21 18:34:07.449 183284 DEBUG oslo_concurrency.lockutils [None req-629e2fa8-8b53-4fe8-b52e-1708d3188998 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:34:07 compute-0 nova_compute[183278]: 2026-01-21 18:34:07.449 183284 DEBUG oslo_concurrency.lockutils [None req-629e2fa8-8b53-4fe8-b52e-1708d3188998 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:34:07 compute-0 nova_compute[183278]: 2026-01-21 18:34:07.449 183284 DEBUG nova.compute.resource_tracker [None req-629e2fa8-8b53-4fe8-b52e-1708d3188998 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 18:34:07 compute-0 nova_compute[183278]: 2026-01-21 18:34:07.595 183284 WARNING nova.virt.libvirt.driver [None req-629e2fa8-8b53-4fe8-b52e-1708d3188998 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 18:34:07 compute-0 nova_compute[183278]: 2026-01-21 18:34:07.596 183284 DEBUG nova.compute.resource_tracker [None req-629e2fa8-8b53-4fe8-b52e-1708d3188998 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5755MB free_disk=73.37895965576172GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 18:34:07 compute-0 nova_compute[183278]: 2026-01-21 18:34:07.596 183284 DEBUG oslo_concurrency.lockutils [None req-629e2fa8-8b53-4fe8-b52e-1708d3188998 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:34:07 compute-0 nova_compute[183278]: 2026-01-21 18:34:07.596 183284 DEBUG oslo_concurrency.lockutils [None req-629e2fa8-8b53-4fe8-b52e-1708d3188998 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:34:07 compute-0 nova_compute[183278]: 2026-01-21 18:34:07.628 183284 DEBUG nova.compute.resource_tracker [None req-629e2fa8-8b53-4fe8-b52e-1708d3188998 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] Migration for instance 4e80cef6-fc6e-4f0f-b3a8-68f17d680983 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Jan 21 18:34:07 compute-0 nova_compute[183278]: 2026-01-21 18:34:07.644 183284 DEBUG nova.compute.resource_tracker [None req-629e2fa8-8b53-4fe8-b52e-1708d3188998 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Jan 21 18:34:07 compute-0 nova_compute[183278]: 2026-01-21 18:34:07.672 183284 DEBUG nova.compute.resource_tracker [None req-629e2fa8-8b53-4fe8-b52e-1708d3188998 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] Migration 1d90ce88-b17f-4bb7-a1aa-c659a114eeca is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Jan 21 18:34:07 compute-0 nova_compute[183278]: 2026-01-21 18:34:07.672 183284 DEBUG nova.compute.resource_tracker [None req-629e2fa8-8b53-4fe8-b52e-1708d3188998 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 18:34:07 compute-0 nova_compute[183278]: 2026-01-21 18:34:07.673 183284 DEBUG nova.compute.resource_tracker [None req-629e2fa8-8b53-4fe8-b52e-1708d3188998 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 18:34:07 compute-0 nova_compute[183278]: 2026-01-21 18:34:07.710 183284 DEBUG nova.compute.provider_tree [None req-629e2fa8-8b53-4fe8-b52e-1708d3188998 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] Inventory has not changed in ProviderTree for provider: 502e4243-611b-433d-a766-9b485d51652d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 18:34:07 compute-0 nova_compute[183278]: 2026-01-21 18:34:07.733 183284 DEBUG nova.scheduler.client.report [None req-629e2fa8-8b53-4fe8-b52e-1708d3188998 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] Inventory has not changed for provider 502e4243-611b-433d-a766-9b485d51652d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 18:34:07 compute-0 nova_compute[183278]: 2026-01-21 18:34:07.754 183284 DEBUG nova.compute.resource_tracker [None req-629e2fa8-8b53-4fe8-b52e-1708d3188998 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 18:34:07 compute-0 nova_compute[183278]: 2026-01-21 18:34:07.754 183284 DEBUG oslo_concurrency.lockutils [None req-629e2fa8-8b53-4fe8-b52e-1708d3188998 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.158s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:34:07 compute-0 nova_compute[183278]: 2026-01-21 18:34:07.761 183284 INFO nova.compute.manager [None req-629e2fa8-8b53-4fe8-b52e-1708d3188998 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Jan 21 18:34:07 compute-0 nova_compute[183278]: 2026-01-21 18:34:07.847 183284 INFO nova.scheduler.client.report [None req-629e2fa8-8b53-4fe8-b52e-1708d3188998 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] Deleted allocation for migration 1d90ce88-b17f-4bb7-a1aa-c659a114eeca
Jan 21 18:34:07 compute-0 nova_compute[183278]: 2026-01-21 18:34:07.848 183284 DEBUG nova.virt.libvirt.driver [None req-629e2fa8-8b53-4fe8-b52e-1708d3188998 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Jan 21 18:34:10 compute-0 nova_compute[183278]: 2026-01-21 18:34:10.946 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 18:34:10 compute-0 nova_compute[183278]: 2026-01-21 18:34:10.947 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 18:34:10 compute-0 nova_compute[183278]: 2026-01-21 18:34:10.948 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 21 18:34:10 compute-0 nova_compute[183278]: 2026-01-21 18:34:10.948 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 21 18:34:10 compute-0 nova_compute[183278]: 2026-01-21 18:34:10.990 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:34:10 compute-0 nova_compute[183278]: 2026-01-21 18:34:10.991 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 21 18:34:11 compute-0 podman[211003]: 2026-01-21 18:34:11.015748819 +0000 UTC m=+0.074755648 container health_status 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.buildah.version=1.33.7, vcs-type=git, release=1755695350, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc.)
Jan 21 18:34:13 compute-0 nova_compute[183278]: 2026-01-21 18:34:13.899 183284 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769020438.8977268, 4e80cef6-fc6e-4f0f-b3a8-68f17d680983 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:34:13 compute-0 nova_compute[183278]: 2026-01-21 18:34:13.899 183284 INFO nova.compute.manager [-] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] VM Stopped (Lifecycle Event)
Jan 21 18:34:13 compute-0 nova_compute[183278]: 2026-01-21 18:34:13.922 183284 DEBUG nova.compute.manager [None req-2dc841a6-7519-4da5-bb00-631b274c3250 - - - - - -] [instance: 4e80cef6-fc6e-4f0f-b3a8-68f17d680983] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:34:14 compute-0 ovn_controller[95419]: 2026-01-21T18:34:14Z|00164|memory_trim|INFO|Detected inactivity (last active 30013 ms ago): trimming memory
Jan 21 18:34:15 compute-0 nova_compute[183278]: 2026-01-21 18:34:15.991 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:34:16 compute-0 podman[211026]: 2026-01-21 18:34:16.999596835 +0000 UTC m=+0.051560506 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Jan 21 18:34:17 compute-0 podman[211025]: 2026-01-21 18:34:17.024599809 +0000 UTC m=+0.079339517 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Jan 21 18:34:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:34:20.103 104698 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:34:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:34:20.104 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:34:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:34:20.104 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:34:20 compute-0 nova_compute[183278]: 2026-01-21 18:34:20.993 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 18:34:20 compute-0 nova_compute[183278]: 2026-01-21 18:34:20.995 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 18:34:20 compute-0 nova_compute[183278]: 2026-01-21 18:34:20.995 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 21 18:34:20 compute-0 nova_compute[183278]: 2026-01-21 18:34:20.995 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 21 18:34:20 compute-0 nova_compute[183278]: 2026-01-21 18:34:20.996 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 21 18:34:20 compute-0 nova_compute[183278]: 2026-01-21 18:34:20.997 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:34:21 compute-0 nova_compute[183278]: 2026-01-21 18:34:21.818 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:34:21 compute-0 nova_compute[183278]: 2026-01-21 18:34:21.819 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 18:34:21 compute-0 nova_compute[183278]: 2026-01-21 18:34:21.819 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 18:34:21 compute-0 nova_compute[183278]: 2026-01-21 18:34:21.847 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 21 18:34:21 compute-0 podman[211066]: 2026-01-21 18:34:21.985024605 +0000 UTC m=+0.044259180 container health_status 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 18:34:24 compute-0 nova_compute[183278]: 2026-01-21 18:34:24.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:34:25 compute-0 nova_compute[183278]: 2026-01-21 18:34:25.997 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:34:26 compute-0 nova_compute[183278]: 2026-01-21 18:34:26.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:34:26 compute-0 nova_compute[183278]: 2026-01-21 18:34:26.841 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:34:26 compute-0 nova_compute[183278]: 2026-01-21 18:34:26.842 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:34:26 compute-0 nova_compute[183278]: 2026-01-21 18:34:26.842 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:34:26 compute-0 nova_compute[183278]: 2026-01-21 18:34:26.842 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 18:34:26 compute-0 nova_compute[183278]: 2026-01-21 18:34:26.962 183284 WARNING nova.virt.libvirt.driver [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 18:34:26 compute-0 nova_compute[183278]: 2026-01-21 18:34:26.963 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5792MB free_disk=73.37897872924805GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 18:34:26 compute-0 nova_compute[183278]: 2026-01-21 18:34:26.963 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:34:26 compute-0 nova_compute[183278]: 2026-01-21 18:34:26.963 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:34:27 compute-0 nova_compute[183278]: 2026-01-21 18:34:27.019 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 18:34:27 compute-0 nova_compute[183278]: 2026-01-21 18:34:27.019 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 18:34:27 compute-0 nova_compute[183278]: 2026-01-21 18:34:27.035 183284 DEBUG nova.compute.provider_tree [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Inventory has not changed in ProviderTree for provider: 502e4243-611b-433d-a766-9b485d51652d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 18:34:27 compute-0 nova_compute[183278]: 2026-01-21 18:34:27.048 183284 DEBUG nova.scheduler.client.report [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Inventory has not changed for provider 502e4243-611b-433d-a766-9b485d51652d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 18:34:27 compute-0 nova_compute[183278]: 2026-01-21 18:34:27.050 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 18:34:27 compute-0 nova_compute[183278]: 2026-01-21 18:34:27.050 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.086s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:34:28 compute-0 nova_compute[183278]: 2026-01-21 18:34:28.045 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:34:28 compute-0 nova_compute[183278]: 2026-01-21 18:34:28.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:34:28 compute-0 nova_compute[183278]: 2026-01-21 18:34:28.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:34:29 compute-0 podman[192560]: time="2026-01-21T18:34:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 18:34:29 compute-0 podman[192560]: @ - - [21/Jan/2026:18:34:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15354 "" "Go-http-client/1.1"
Jan 21 18:34:29 compute-0 podman[192560]: @ - - [21/Jan/2026:18:34:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2173 "" "Go-http-client/1.1"
Jan 21 18:34:29 compute-0 nova_compute[183278]: 2026-01-21 18:34:29.818 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:34:30 compute-0 nova_compute[183278]: 2026-01-21 18:34:30.999 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:34:31 compute-0 openstack_network_exporter[195402]: ERROR   18:34:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 21 18:34:31 compute-0 openstack_network_exporter[195402]: ERROR   18:34:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 21 18:34:34 compute-0 nova_compute[183278]: 2026-01-21 18:34:34.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:34:35 compute-0 nova_compute[183278]: 2026-01-21 18:34:35.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:34:35 compute-0 nova_compute[183278]: 2026-01-21 18:34:35.817 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 18:34:36 compute-0 nova_compute[183278]: 2026-01-21 18:34:36.000 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 18:34:36 compute-0 nova_compute[183278]: 2026-01-21 18:34:36.001 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:34:36 compute-0 nova_compute[183278]: 2026-01-21 18:34:36.001 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 21 18:34:36 compute-0 nova_compute[183278]: 2026-01-21 18:34:36.001 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 21 18:34:36 compute-0 nova_compute[183278]: 2026-01-21 18:34:36.001 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 21 18:34:36 compute-0 nova_compute[183278]: 2026-01-21 18:34:36.002 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:34:39 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Jan 21 18:34:41 compute-0 nova_compute[183278]: 2026-01-21 18:34:41.003 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:34:41 compute-0 podman[211094]: 2026-01-21 18:34:41.492369646 +0000 UTC m=+0.049932927 container health_status 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, name=ubi9-minimal, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, distribution-scope=public, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, version=9.6, com.redhat.component=ubi9-minimal-container, architecture=x86_64, config_id=openstack_network_exporter)
Jan 21 18:34:46 compute-0 nova_compute[183278]: 2026-01-21 18:34:46.004 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:34:47 compute-0 podman[211116]: 2026-01-21 18:34:47.99937464 +0000 UTC m=+0.047842776 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 18:34:48 compute-0 podman[211115]: 2026-01-21 18:34:48.026458974 +0000 UTC m=+0.077403370 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Jan 21 18:34:51 compute-0 nova_compute[183278]: 2026-01-21 18:34:51.006 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 18:34:51 compute-0 nova_compute[183278]: 2026-01-21 18:34:51.007 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:34:51 compute-0 nova_compute[183278]: 2026-01-21 18:34:51.007 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 21 18:34:51 compute-0 nova_compute[183278]: 2026-01-21 18:34:51.007 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 21 18:34:51 compute-0 nova_compute[183278]: 2026-01-21 18:34:51.008 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 21 18:34:51 compute-0 nova_compute[183278]: 2026-01-21 18:34:51.009 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:34:52 compute-0 podman[211158]: 2026-01-21 18:34:52.985548578 +0000 UTC m=+0.046092104 container health_status 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 18:34:56 compute-0 nova_compute[183278]: 2026-01-21 18:34:56.009 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:34:59 compute-0 podman[192560]: time="2026-01-21T18:34:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 18:34:59 compute-0 podman[192560]: @ - - [21/Jan/2026:18:34:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15354 "" "Go-http-client/1.1"
Jan 21 18:34:59 compute-0 podman[192560]: @ - - [21/Jan/2026:18:34:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2176 "" "Go-http-client/1.1"
Jan 21 18:35:01 compute-0 nova_compute[183278]: 2026-01-21 18:35:01.011 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 18:35:01 compute-0 nova_compute[183278]: 2026-01-21 18:35:01.012 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:35:01 compute-0 nova_compute[183278]: 2026-01-21 18:35:01.012 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 21 18:35:01 compute-0 nova_compute[183278]: 2026-01-21 18:35:01.012 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 21 18:35:01 compute-0 nova_compute[183278]: 2026-01-21 18:35:01.013 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 21 18:35:01 compute-0 nova_compute[183278]: 2026-01-21 18:35:01.013 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:35:01 compute-0 nova_compute[183278]: 2026-01-21 18:35:01.097 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:35:01 compute-0 openstack_network_exporter[195402]: ERROR   18:35:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 21 18:35:01 compute-0 openstack_network_exporter[195402]: ERROR   18:35:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 21 18:35:02 compute-0 ovn_controller[95419]: 2026-01-21T18:35:02Z|00165|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Jan 21 18:35:06 compute-0 nova_compute[183278]: 2026-01-21 18:35:06.014 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:35:06 compute-0 nova_compute[183278]: 2026-01-21 18:35:06.097 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:35:11 compute-0 nova_compute[183278]: 2026-01-21 18:35:11.017 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:35:11 compute-0 nova_compute[183278]: 2026-01-21 18:35:11.099 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:35:12 compute-0 podman[211181]: 2026-01-21 18:35:12.02637473 +0000 UTC m=+0.076370205 container health_status 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=edpm_ansible, 
url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., container_name=openstack_network_exporter, distribution-scope=public, maintainer=Red Hat, Inc., architecture=x86_64, release=1755695350, io.buildah.version=1.33.7, vcs-type=git, version=9.6)
Jan 21 18:35:16 compute-0 nova_compute[183278]: 2026-01-21 18:35:16.019 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:35:16 compute-0 nova_compute[183278]: 2026-01-21 18:35:16.101 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:35:16 compute-0 nova_compute[183278]: 2026-01-21 18:35:16.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:35:16 compute-0 nova_compute[183278]: 2026-01-21 18:35:16.818 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 21 18:35:18 compute-0 podman[211204]: 2026-01-21 18:35:18.99740039 +0000 UTC m=+0.051479264 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 18:35:19 compute-0 podman[211203]: 2026-01-21 18:35:19.049797206 +0000 UTC m=+0.109026324 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 21 18:35:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:35:20.104 104698 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:35:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:35:20.105 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:35:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:35:20.105 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:35:21 compute-0 nova_compute[183278]: 2026-01-21 18:35:21.022 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:35:21 compute-0 nova_compute[183278]: 2026-01-21 18:35:21.106 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:35:23 compute-0 nova_compute[183278]: 2026-01-21 18:35:23.841 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:35:23 compute-0 nova_compute[183278]: 2026-01-21 18:35:23.841 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 18:35:23 compute-0 nova_compute[183278]: 2026-01-21 18:35:23.841 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 18:35:23 compute-0 nova_compute[183278]: 2026-01-21 18:35:23.856 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 21 18:35:23 compute-0 podman[211246]: 2026-01-21 18:35:23.989610894 +0000 UTC m=+0.048872451 container health_status 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 21 18:35:24 compute-0 nova_compute[183278]: 2026-01-21 18:35:24.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:35:24 compute-0 nova_compute[183278]: 2026-01-21 18:35:24.817 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 21 18:35:24 compute-0 nova_compute[183278]: 2026-01-21 18:35:24.836 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 21 18:35:25 compute-0 nova_compute[183278]: 2026-01-21 18:35:25.191 183284 DEBUG oslo_concurrency.lockutils [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Acquiring lock "1b617117-9d85-42be-bfd4-9f5228156160" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:35:25 compute-0 nova_compute[183278]: 2026-01-21 18:35:25.191 183284 DEBUG oslo_concurrency.lockutils [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "1b617117-9d85-42be-bfd4-9f5228156160" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:35:25 compute-0 nova_compute[183278]: 2026-01-21 18:35:25.205 183284 DEBUG nova.compute.manager [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 21 18:35:25 compute-0 nova_compute[183278]: 2026-01-21 18:35:25.341 183284 DEBUG oslo_concurrency.lockutils [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:35:25 compute-0 nova_compute[183278]: 2026-01-21 18:35:25.342 183284 DEBUG oslo_concurrency.lockutils [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:35:25 compute-0 nova_compute[183278]: 2026-01-21 18:35:25.349 183284 DEBUG nova.virt.hardware [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 18:35:25 compute-0 nova_compute[183278]: 2026-01-21 18:35:25.350 183284 INFO nova.compute.claims [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Claim successful on node compute-0.ctlplane.example.com
Jan 21 18:35:25 compute-0 nova_compute[183278]: 2026-01-21 18:35:25.486 183284 DEBUG nova.compute.provider_tree [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Inventory has not changed in ProviderTree for provider: 502e4243-611b-433d-a766-9b485d51652d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 18:35:25 compute-0 nova_compute[183278]: 2026-01-21 18:35:25.501 183284 DEBUG nova.scheduler.client.report [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Inventory has not changed for provider 502e4243-611b-433d-a766-9b485d51652d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 18:35:25 compute-0 nova_compute[183278]: 2026-01-21 18:35:25.518 183284 DEBUG oslo_concurrency.lockutils [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.177s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:35:25 compute-0 nova_compute[183278]: 2026-01-21 18:35:25.519 183284 DEBUG nova.compute.manager [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 21 18:35:25 compute-0 nova_compute[183278]: 2026-01-21 18:35:25.562 183284 DEBUG nova.compute.manager [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 21 18:35:25 compute-0 nova_compute[183278]: 2026-01-21 18:35:25.563 183284 DEBUG nova.network.neutron [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 21 18:35:25 compute-0 nova_compute[183278]: 2026-01-21 18:35:25.592 183284 INFO nova.virt.libvirt.driver [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 21 18:35:25 compute-0 nova_compute[183278]: 2026-01-21 18:35:25.656 183284 DEBUG nova.compute.manager [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 21 18:35:25 compute-0 nova_compute[183278]: 2026-01-21 18:35:25.838 183284 DEBUG nova.compute.manager [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 21 18:35:25 compute-0 nova_compute[183278]: 2026-01-21 18:35:25.839 183284 DEBUG nova.virt.libvirt.driver [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 18:35:25 compute-0 nova_compute[183278]: 2026-01-21 18:35:25.840 183284 INFO nova.virt.libvirt.driver [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Creating image(s)
Jan 21 18:35:25 compute-0 nova_compute[183278]: 2026-01-21 18:35:25.841 183284 DEBUG oslo_concurrency.lockutils [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Acquiring lock "/var/lib/nova/instances/1b617117-9d85-42be-bfd4-9f5228156160/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:35:25 compute-0 nova_compute[183278]: 2026-01-21 18:35:25.841 183284 DEBUG oslo_concurrency.lockutils [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "/var/lib/nova/instances/1b617117-9d85-42be-bfd4-9f5228156160/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:35:25 compute-0 nova_compute[183278]: 2026-01-21 18:35:25.842 183284 DEBUG oslo_concurrency.lockutils [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "/var/lib/nova/instances/1b617117-9d85-42be-bfd4-9f5228156160/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:35:25 compute-0 nova_compute[183278]: 2026-01-21 18:35:25.862 183284 DEBUG oslo_concurrency.processutils [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:35:25 compute-0 nova_compute[183278]: 2026-01-21 18:35:25.915 183284 DEBUG oslo_concurrency.processutils [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:35:25 compute-0 nova_compute[183278]: 2026-01-21 18:35:25.916 183284 DEBUG oslo_concurrency.lockutils [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Acquiring lock "8bf45782a806e8f2f684ae874be9ab99d891a685" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:35:25 compute-0 nova_compute[183278]: 2026-01-21 18:35:25.917 183284 DEBUG oslo_concurrency.lockutils [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "8bf45782a806e8f2f684ae874be9ab99d891a685" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:35:25 compute-0 nova_compute[183278]: 2026-01-21 18:35:25.927 183284 DEBUG oslo_concurrency.processutils [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:35:26 compute-0 nova_compute[183278]: 2026-01-21 18:35:25.999 183284 DEBUG oslo_concurrency.processutils [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:35:26 compute-0 nova_compute[183278]: 2026-01-21 18:35:26.000 183284 DEBUG oslo_concurrency.processutils [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685,backing_fmt=raw /var/lib/nova/instances/1b617117-9d85-42be-bfd4-9f5228156160/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:35:26 compute-0 nova_compute[183278]: 2026-01-21 18:35:26.025 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:35:26 compute-0 nova_compute[183278]: 2026-01-21 18:35:26.036 183284 DEBUG oslo_concurrency.processutils [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685,backing_fmt=raw /var/lib/nova/instances/1b617117-9d85-42be-bfd4-9f5228156160/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:35:26 compute-0 nova_compute[183278]: 2026-01-21 18:35:26.037 183284 DEBUG oslo_concurrency.lockutils [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "8bf45782a806e8f2f684ae874be9ab99d891a685" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.120s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:35:26 compute-0 nova_compute[183278]: 2026-01-21 18:35:26.037 183284 DEBUG oslo_concurrency.processutils [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:35:26 compute-0 nova_compute[183278]: 2026-01-21 18:35:26.094 183284 DEBUG oslo_concurrency.processutils [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:35:26 compute-0 nova_compute[183278]: 2026-01-21 18:35:26.095 183284 DEBUG nova.virt.disk.api [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Checking if we can resize image /var/lib/nova/instances/1b617117-9d85-42be-bfd4-9f5228156160/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 18:35:26 compute-0 nova_compute[183278]: 2026-01-21 18:35:26.096 183284 DEBUG oslo_concurrency.processutils [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1b617117-9d85-42be-bfd4-9f5228156160/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:35:26 compute-0 nova_compute[183278]: 2026-01-21 18:35:26.109 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:35:26 compute-0 nova_compute[183278]: 2026-01-21 18:35:26.150 183284 DEBUG oslo_concurrency.processutils [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1b617117-9d85-42be-bfd4-9f5228156160/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:35:26 compute-0 nova_compute[183278]: 2026-01-21 18:35:26.151 183284 DEBUG nova.virt.disk.api [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Cannot resize image /var/lib/nova/instances/1b617117-9d85-42be-bfd4-9f5228156160/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 18:35:26 compute-0 nova_compute[183278]: 2026-01-21 18:35:26.152 183284 DEBUG nova.objects.instance [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lazy-loading 'migration_context' on Instance uuid 1b617117-9d85-42be-bfd4-9f5228156160 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:35:26 compute-0 nova_compute[183278]: 2026-01-21 18:35:26.176 183284 DEBUG nova.virt.libvirt.driver [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 18:35:26 compute-0 nova_compute[183278]: 2026-01-21 18:35:26.176 183284 DEBUG nova.virt.libvirt.driver [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Ensure instance console log exists: /var/lib/nova/instances/1b617117-9d85-42be-bfd4-9f5228156160/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 18:35:26 compute-0 nova_compute[183278]: 2026-01-21 18:35:26.177 183284 DEBUG oslo_concurrency.lockutils [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:35:26 compute-0 nova_compute[183278]: 2026-01-21 18:35:26.177 183284 DEBUG oslo_concurrency.lockutils [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:35:26 compute-0 nova_compute[183278]: 2026-01-21 18:35:26.177 183284 DEBUG oslo_concurrency.lockutils [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:35:26 compute-0 nova_compute[183278]: 2026-01-21 18:35:26.708 183284 DEBUG nova.policy [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '41dc6e790bc54fbfaf5c6007d3fa5f63', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'fe688847145f4dee992c72dd40bbc1ac', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 21 18:35:26 compute-0 nova_compute[183278]: 2026-01-21 18:35:26.835 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:35:26 compute-0 nova_compute[183278]: 2026-01-21 18:35:26.836 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:35:26 compute-0 nova_compute[183278]: 2026-01-21 18:35:26.856 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:35:26 compute-0 nova_compute[183278]: 2026-01-21 18:35:26.857 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:35:26 compute-0 nova_compute[183278]: 2026-01-21 18:35:26.857 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:35:26 compute-0 nova_compute[183278]: 2026-01-21 18:35:26.857 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 18:35:26 compute-0 nova_compute[183278]: 2026-01-21 18:35:26.985 183284 WARNING nova.virt.libvirt.driver [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 18:35:26 compute-0 nova_compute[183278]: 2026-01-21 18:35:26.986 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5805MB free_disk=73.37882232666016GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 18:35:26 compute-0 nova_compute[183278]: 2026-01-21 18:35:26.986 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:35:26 compute-0 nova_compute[183278]: 2026-01-21 18:35:26.987 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:35:27 compute-0 nova_compute[183278]: 2026-01-21 18:35:27.048 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Instance 1b617117-9d85-42be-bfd4-9f5228156160 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 21 18:35:27 compute-0 nova_compute[183278]: 2026-01-21 18:35:27.048 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 18:35:27 compute-0 nova_compute[183278]: 2026-01-21 18:35:27.048 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 18:35:27 compute-0 nova_compute[183278]: 2026-01-21 18:35:27.094 183284 DEBUG nova.compute.provider_tree [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Inventory has not changed in ProviderTree for provider: 502e4243-611b-433d-a766-9b485d51652d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 18:35:27 compute-0 nova_compute[183278]: 2026-01-21 18:35:27.114 183284 DEBUG nova.scheduler.client.report [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Inventory has not changed for provider 502e4243-611b-433d-a766-9b485d51652d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 18:35:27 compute-0 nova_compute[183278]: 2026-01-21 18:35:27.138 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 18:35:27 compute-0 nova_compute[183278]: 2026-01-21 18:35:27.139 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.152s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:35:27 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:35:27.771 104698 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=22, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e2:aa:66', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f6:39:87:73:c8'}, ipsec=False) old=SB_Global(nb_cfg=21) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 18:35:27 compute-0 nova_compute[183278]: 2026-01-21 18:35:27.771 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:35:27 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:35:27.772 104698 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 21 18:35:28 compute-0 nova_compute[183278]: 2026-01-21 18:35:28.742 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:35:28 compute-0 nova_compute[183278]: 2026-01-21 18:35:28.819 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:35:28 compute-0 nova_compute[183278]: 2026-01-21 18:35:28.822 183284 DEBUG nova.network.neutron [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Successfully created port: a275d014-1cf3-4d9f-8e50-17958d3ca2a6 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 21 18:35:29 compute-0 podman[192560]: time="2026-01-21T18:35:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 18:35:29 compute-0 podman[192560]: @ - - [21/Jan/2026:18:35:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15354 "" "Go-http-client/1.1"
Jan 21 18:35:29 compute-0 podman[192560]: @ - - [21/Jan/2026:18:35:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2177 "" "Go-http-client/1.1"
Jan 21 18:35:29 compute-0 nova_compute[183278]: 2026-01-21 18:35:29.812 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:35:29 compute-0 nova_compute[183278]: 2026-01-21 18:35:29.815 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:35:29 compute-0 nova_compute[183278]: 2026-01-21 18:35:29.851 183284 DEBUG nova.network.neutron [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Successfully updated port: a275d014-1cf3-4d9f-8e50-17958d3ca2a6 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 21 18:35:29 compute-0 nova_compute[183278]: 2026-01-21 18:35:29.867 183284 DEBUG oslo_concurrency.lockutils [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Acquiring lock "refresh_cache-1b617117-9d85-42be-bfd4-9f5228156160" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:35:29 compute-0 nova_compute[183278]: 2026-01-21 18:35:29.867 183284 DEBUG oslo_concurrency.lockutils [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Acquired lock "refresh_cache-1b617117-9d85-42be-bfd4-9f5228156160" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 18:35:29 compute-0 nova_compute[183278]: 2026-01-21 18:35:29.868 183284 DEBUG nova.network.neutron [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 18:35:29 compute-0 nova_compute[183278]: 2026-01-21 18:35:29.944 183284 DEBUG nova.compute.manager [req-d9232d19-d921-45bb-b604-fce2fa063587 req-e7552117-97b0-401e-bb8a-8fdee13d7a08 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Received event network-changed-a275d014-1cf3-4d9f-8e50-17958d3ca2a6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:35:29 compute-0 nova_compute[183278]: 2026-01-21 18:35:29.945 183284 DEBUG nova.compute.manager [req-d9232d19-d921-45bb-b604-fce2fa063587 req-e7552117-97b0-401e-bb8a-8fdee13d7a08 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Refreshing instance network info cache due to event network-changed-a275d014-1cf3-4d9f-8e50-17958d3ca2a6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 18:35:29 compute-0 nova_compute[183278]: 2026-01-21 18:35:29.945 183284 DEBUG oslo_concurrency.lockutils [req-d9232d19-d921-45bb-b604-fce2fa063587 req-e7552117-97b0-401e-bb8a-8fdee13d7a08 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "refresh_cache-1b617117-9d85-42be-bfd4-9f5228156160" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:35:30 compute-0 nova_compute[183278]: 2026-01-21 18:35:30.723 183284 DEBUG nova.network.neutron [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 18:35:30 compute-0 nova_compute[183278]: 2026-01-21 18:35:30.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:35:31 compute-0 nova_compute[183278]: 2026-01-21 18:35:31.027 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:35:31 compute-0 nova_compute[183278]: 2026-01-21 18:35:31.109 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:35:31 compute-0 openstack_network_exporter[195402]: ERROR   18:35:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 21 18:35:31 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:35:31 compute-0 openstack_network_exporter[195402]: ERROR   18:35:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 21 18:35:31 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:35:31 compute-0 nova_compute[183278]: 2026-01-21 18:35:31.812 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:35:32 compute-0 nova_compute[183278]: 2026-01-21 18:35:32.001 183284 DEBUG nova.network.neutron [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Updating instance_info_cache with network_info: [{"id": "a275d014-1cf3-4d9f-8e50-17958d3ca2a6", "address": "fa:16:3e:c8:80:d8", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa275d014-1c", "ovs_interfaceid": "a275d014-1cf3-4d9f-8e50-17958d3ca2a6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:35:32 compute-0 nova_compute[183278]: 2026-01-21 18:35:32.025 183284 DEBUG oslo_concurrency.lockutils [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Releasing lock "refresh_cache-1b617117-9d85-42be-bfd4-9f5228156160" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 18:35:32 compute-0 nova_compute[183278]: 2026-01-21 18:35:32.026 183284 DEBUG nova.compute.manager [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Instance network_info: |[{"id": "a275d014-1cf3-4d9f-8e50-17958d3ca2a6", "address": "fa:16:3e:c8:80:d8", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa275d014-1c", "ovs_interfaceid": "a275d014-1cf3-4d9f-8e50-17958d3ca2a6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 21 18:35:32 compute-0 nova_compute[183278]: 2026-01-21 18:35:32.026 183284 DEBUG oslo_concurrency.lockutils [req-d9232d19-d921-45bb-b604-fce2fa063587 req-e7552117-97b0-401e-bb8a-8fdee13d7a08 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquired lock "refresh_cache-1b617117-9d85-42be-bfd4-9f5228156160" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 18:35:32 compute-0 nova_compute[183278]: 2026-01-21 18:35:32.026 183284 DEBUG nova.network.neutron [req-d9232d19-d921-45bb-b604-fce2fa063587 req-e7552117-97b0-401e-bb8a-8fdee13d7a08 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Refreshing network info cache for port a275d014-1cf3-4d9f-8e50-17958d3ca2a6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 18:35:32 compute-0 nova_compute[183278]: 2026-01-21 18:35:32.028 183284 DEBUG nova.virt.libvirt.driver [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Start _get_guest_xml network_info=[{"id": "a275d014-1cf3-4d9f-8e50-17958d3ca2a6", "address": "fa:16:3e:c8:80:d8", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa275d014-1c", "ovs_interfaceid": "a275d014-1cf3-4d9f-8e50-17958d3ca2a6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T18:09:31Z,direct_url=<?>,disk_format='qcow2',id=672306ae-5521-4fc1-a825-a16d6d125c61,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='05bd8d3790e24414a90ecf55ee1939e1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T18:09:33Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_type': 'disk', 'encryption_options': None, 'boot_index': 0, 'encryption_secret_uuid': None, 'encryption_format': None, 'image_id': '672306ae-5521-4fc1-a825-a16d6d125c61'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 18:35:32 compute-0 nova_compute[183278]: 2026-01-21 18:35:32.032 183284 WARNING nova.virt.libvirt.driver [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 18:35:32 compute-0 nova_compute[183278]: 2026-01-21 18:35:32.038 183284 DEBUG nova.virt.libvirt.host [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 18:35:32 compute-0 nova_compute[183278]: 2026-01-21 18:35:32.038 183284 DEBUG nova.virt.libvirt.host [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 18:35:32 compute-0 nova_compute[183278]: 2026-01-21 18:35:32.041 183284 DEBUG nova.virt.libvirt.host [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 18:35:32 compute-0 nova_compute[183278]: 2026-01-21 18:35:32.042 183284 DEBUG nova.virt.libvirt.host [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 18:35:32 compute-0 nova_compute[183278]: 2026-01-21 18:35:32.043 183284 DEBUG nova.virt.libvirt.driver [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 18:35:32 compute-0 nova_compute[183278]: 2026-01-21 18:35:32.043 183284 DEBUG nova.virt.hardware [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T18:09:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='45095fe9-3fd5-4f1f-87b2-a2a8292135a2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T18:09:31Z,direct_url=<?>,disk_format='qcow2',id=672306ae-5521-4fc1-a825-a16d6d125c61,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='05bd8d3790e24414a90ecf55ee1939e1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T18:09:33Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 18:35:32 compute-0 nova_compute[183278]: 2026-01-21 18:35:32.043 183284 DEBUG nova.virt.hardware [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 18:35:32 compute-0 nova_compute[183278]: 2026-01-21 18:35:32.043 183284 DEBUG nova.virt.hardware [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 18:35:32 compute-0 nova_compute[183278]: 2026-01-21 18:35:32.044 183284 DEBUG nova.virt.hardware [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 18:35:32 compute-0 nova_compute[183278]: 2026-01-21 18:35:32.044 183284 DEBUG nova.virt.hardware [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 18:35:32 compute-0 nova_compute[183278]: 2026-01-21 18:35:32.044 183284 DEBUG nova.virt.hardware [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 18:35:32 compute-0 nova_compute[183278]: 2026-01-21 18:35:32.044 183284 DEBUG nova.virt.hardware [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 18:35:32 compute-0 nova_compute[183278]: 2026-01-21 18:35:32.044 183284 DEBUG nova.virt.hardware [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 18:35:32 compute-0 nova_compute[183278]: 2026-01-21 18:35:32.044 183284 DEBUG nova.virt.hardware [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 18:35:32 compute-0 nova_compute[183278]: 2026-01-21 18:35:32.045 183284 DEBUG nova.virt.hardware [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 18:35:32 compute-0 nova_compute[183278]: 2026-01-21 18:35:32.045 183284 DEBUG nova.virt.hardware [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 18:35:32 compute-0 nova_compute[183278]: 2026-01-21 18:35:32.048 183284 DEBUG nova.virt.libvirt.vif [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T18:35:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-656242378',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-656242378',id=23,image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fe688847145f4dee992c72dd40bbc1ac',ramdisk_id='',reservation_id='r-0prerflf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1753607426',owner_user_name='tempest-TestExecuteStrategies-1753607426-project-member'},tags=TagList,task_s
tate='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T18:35:25Z,user_data=None,user_id='41dc6e790bc54fbfaf5c6007d3fa5f63',uuid=1b617117-9d85-42be-bfd4-9f5228156160,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a275d014-1cf3-4d9f-8e50-17958d3ca2a6", "address": "fa:16:3e:c8:80:d8", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa275d014-1c", "ovs_interfaceid": "a275d014-1cf3-4d9f-8e50-17958d3ca2a6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 18:35:32 compute-0 nova_compute[183278]: 2026-01-21 18:35:32.048 183284 DEBUG nova.network.os_vif_util [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Converting VIF {"id": "a275d014-1cf3-4d9f-8e50-17958d3ca2a6", "address": "fa:16:3e:c8:80:d8", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa275d014-1c", "ovs_interfaceid": "a275d014-1cf3-4d9f-8e50-17958d3ca2a6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 18:35:32 compute-0 nova_compute[183278]: 2026-01-21 18:35:32.048 183284 DEBUG nova.network.os_vif_util [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c8:80:d8,bridge_name='br-int',has_traffic_filtering=True,id=a275d014-1cf3-4d9f-8e50-17958d3ca2a6,network=Network(405ec01b-76d3-4c3c-a31b-5f16d9641fbf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa275d014-1c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 18:35:32 compute-0 nova_compute[183278]: 2026-01-21 18:35:32.049 183284 DEBUG nova.objects.instance [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lazy-loading 'pci_devices' on Instance uuid 1b617117-9d85-42be-bfd4-9f5228156160 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:35:32 compute-0 nova_compute[183278]: 2026-01-21 18:35:32.071 183284 DEBUG nova.virt.libvirt.driver [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] End _get_guest_xml xml=<domain type="kvm">
Jan 21 18:35:32 compute-0 nova_compute[183278]:   <uuid>1b617117-9d85-42be-bfd4-9f5228156160</uuid>
Jan 21 18:35:32 compute-0 nova_compute[183278]:   <name>instance-00000017</name>
Jan 21 18:35:32 compute-0 nova_compute[183278]:   <memory>131072</memory>
Jan 21 18:35:32 compute-0 nova_compute[183278]:   <vcpu>1</vcpu>
Jan 21 18:35:32 compute-0 nova_compute[183278]:   <metadata>
Jan 21 18:35:32 compute-0 nova_compute[183278]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 18:35:32 compute-0 nova_compute[183278]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 18:35:32 compute-0 nova_compute[183278]:       <nova:name>tempest-TestExecuteStrategies-server-656242378</nova:name>
Jan 21 18:35:32 compute-0 nova_compute[183278]:       <nova:creationTime>2026-01-21 18:35:32</nova:creationTime>
Jan 21 18:35:32 compute-0 nova_compute[183278]:       <nova:flavor name="m1.nano">
Jan 21 18:35:32 compute-0 nova_compute[183278]:         <nova:memory>128</nova:memory>
Jan 21 18:35:32 compute-0 nova_compute[183278]:         <nova:disk>1</nova:disk>
Jan 21 18:35:32 compute-0 nova_compute[183278]:         <nova:swap>0</nova:swap>
Jan 21 18:35:32 compute-0 nova_compute[183278]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 18:35:32 compute-0 nova_compute[183278]:         <nova:vcpus>1</nova:vcpus>
Jan 21 18:35:32 compute-0 nova_compute[183278]:       </nova:flavor>
Jan 21 18:35:32 compute-0 nova_compute[183278]:       <nova:owner>
Jan 21 18:35:32 compute-0 nova_compute[183278]:         <nova:user uuid="41dc6e790bc54fbfaf5c6007d3fa5f63">tempest-TestExecuteStrategies-1753607426-project-member</nova:user>
Jan 21 18:35:32 compute-0 nova_compute[183278]:         <nova:project uuid="fe688847145f4dee992c72dd40bbc1ac">tempest-TestExecuteStrategies-1753607426</nova:project>
Jan 21 18:35:32 compute-0 nova_compute[183278]:       </nova:owner>
Jan 21 18:35:32 compute-0 nova_compute[183278]:       <nova:root type="image" uuid="672306ae-5521-4fc1-a825-a16d6d125c61"/>
Jan 21 18:35:32 compute-0 nova_compute[183278]:       <nova:ports>
Jan 21 18:35:32 compute-0 nova_compute[183278]:         <nova:port uuid="a275d014-1cf3-4d9f-8e50-17958d3ca2a6">
Jan 21 18:35:32 compute-0 nova_compute[183278]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 21 18:35:32 compute-0 nova_compute[183278]:         </nova:port>
Jan 21 18:35:32 compute-0 nova_compute[183278]:       </nova:ports>
Jan 21 18:35:32 compute-0 nova_compute[183278]:     </nova:instance>
Jan 21 18:35:32 compute-0 nova_compute[183278]:   </metadata>
Jan 21 18:35:32 compute-0 nova_compute[183278]:   <sysinfo type="smbios">
Jan 21 18:35:32 compute-0 nova_compute[183278]:     <system>
Jan 21 18:35:32 compute-0 nova_compute[183278]:       <entry name="manufacturer">RDO</entry>
Jan 21 18:35:32 compute-0 nova_compute[183278]:       <entry name="product">OpenStack Compute</entry>
Jan 21 18:35:32 compute-0 nova_compute[183278]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 18:35:32 compute-0 nova_compute[183278]:       <entry name="serial">1b617117-9d85-42be-bfd4-9f5228156160</entry>
Jan 21 18:35:32 compute-0 nova_compute[183278]:       <entry name="uuid">1b617117-9d85-42be-bfd4-9f5228156160</entry>
Jan 21 18:35:32 compute-0 nova_compute[183278]:       <entry name="family">Virtual Machine</entry>
Jan 21 18:35:32 compute-0 nova_compute[183278]:     </system>
Jan 21 18:35:32 compute-0 nova_compute[183278]:   </sysinfo>
Jan 21 18:35:32 compute-0 nova_compute[183278]:   <os>
Jan 21 18:35:32 compute-0 nova_compute[183278]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 18:35:32 compute-0 nova_compute[183278]:     <boot dev="hd"/>
Jan 21 18:35:32 compute-0 nova_compute[183278]:     <smbios mode="sysinfo"/>
Jan 21 18:35:32 compute-0 nova_compute[183278]:   </os>
Jan 21 18:35:32 compute-0 nova_compute[183278]:   <features>
Jan 21 18:35:32 compute-0 nova_compute[183278]:     <acpi/>
Jan 21 18:35:32 compute-0 nova_compute[183278]:     <apic/>
Jan 21 18:35:32 compute-0 nova_compute[183278]:     <vmcoreinfo/>
Jan 21 18:35:32 compute-0 nova_compute[183278]:   </features>
Jan 21 18:35:32 compute-0 nova_compute[183278]:   <clock offset="utc">
Jan 21 18:35:32 compute-0 nova_compute[183278]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 18:35:32 compute-0 nova_compute[183278]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 18:35:32 compute-0 nova_compute[183278]:     <timer name="hpet" present="no"/>
Jan 21 18:35:32 compute-0 nova_compute[183278]:   </clock>
Jan 21 18:35:32 compute-0 nova_compute[183278]:   <cpu mode="custom" match="exact">
Jan 21 18:35:32 compute-0 nova_compute[183278]:     <model>Nehalem</model>
Jan 21 18:35:32 compute-0 nova_compute[183278]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 18:35:32 compute-0 nova_compute[183278]:   </cpu>
Jan 21 18:35:32 compute-0 nova_compute[183278]:   <devices>
Jan 21 18:35:32 compute-0 nova_compute[183278]:     <disk type="file" device="disk">
Jan 21 18:35:32 compute-0 nova_compute[183278]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 18:35:32 compute-0 nova_compute[183278]:       <source file="/var/lib/nova/instances/1b617117-9d85-42be-bfd4-9f5228156160/disk"/>
Jan 21 18:35:32 compute-0 nova_compute[183278]:       <target dev="vda" bus="virtio"/>
Jan 21 18:35:32 compute-0 nova_compute[183278]:     </disk>
Jan 21 18:35:32 compute-0 nova_compute[183278]:     <disk type="file" device="cdrom">
Jan 21 18:35:32 compute-0 nova_compute[183278]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 18:35:32 compute-0 nova_compute[183278]:       <source file="/var/lib/nova/instances/1b617117-9d85-42be-bfd4-9f5228156160/disk.config"/>
Jan 21 18:35:32 compute-0 nova_compute[183278]:       <target dev="sda" bus="sata"/>
Jan 21 18:35:32 compute-0 nova_compute[183278]:     </disk>
Jan 21 18:35:32 compute-0 nova_compute[183278]:     <interface type="ethernet">
Jan 21 18:35:32 compute-0 nova_compute[183278]:       <mac address="fa:16:3e:c8:80:d8"/>
Jan 21 18:35:32 compute-0 nova_compute[183278]:       <model type="virtio"/>
Jan 21 18:35:32 compute-0 nova_compute[183278]:       <driver name="vhost" rx_queue_size="512"/>
Jan 21 18:35:32 compute-0 nova_compute[183278]:       <mtu size="1442"/>
Jan 21 18:35:32 compute-0 nova_compute[183278]:       <target dev="tapa275d014-1c"/>
Jan 21 18:35:32 compute-0 nova_compute[183278]:     </interface>
Jan 21 18:35:32 compute-0 nova_compute[183278]:     <serial type="pty">
Jan 21 18:35:32 compute-0 nova_compute[183278]:       <log file="/var/lib/nova/instances/1b617117-9d85-42be-bfd4-9f5228156160/console.log" append="off"/>
Jan 21 18:35:32 compute-0 nova_compute[183278]:     </serial>
Jan 21 18:35:32 compute-0 nova_compute[183278]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 18:35:32 compute-0 nova_compute[183278]:     <video>
Jan 21 18:35:32 compute-0 nova_compute[183278]:       <model type="virtio"/>
Jan 21 18:35:32 compute-0 nova_compute[183278]:     </video>
Jan 21 18:35:32 compute-0 nova_compute[183278]:     <input type="tablet" bus="usb"/>
Jan 21 18:35:32 compute-0 nova_compute[183278]:     <rng model="virtio">
Jan 21 18:35:32 compute-0 nova_compute[183278]:       <backend model="random">/dev/urandom</backend>
Jan 21 18:35:32 compute-0 nova_compute[183278]:     </rng>
Jan 21 18:35:32 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root"/>
Jan 21 18:35:32 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:35:32 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:35:32 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:35:32 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:35:32 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:35:32 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:35:32 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:35:32 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:35:32 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:35:32 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:35:32 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:35:32 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:35:32 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:35:32 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:35:32 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:35:32 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:35:32 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:35:32 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:35:32 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:35:32 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:35:32 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:35:32 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:35:32 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:35:32 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:35:32 compute-0 nova_compute[183278]:     <controller type="usb" index="0"/>
Jan 21 18:35:32 compute-0 nova_compute[183278]:     <memballoon model="virtio">
Jan 21 18:35:32 compute-0 nova_compute[183278]:       <stats period="10"/>
Jan 21 18:35:32 compute-0 nova_compute[183278]:     </memballoon>
Jan 21 18:35:32 compute-0 nova_compute[183278]:   </devices>
Jan 21 18:35:32 compute-0 nova_compute[183278]: </domain>
Jan 21 18:35:32 compute-0 nova_compute[183278]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 18:35:32 compute-0 nova_compute[183278]: 2026-01-21 18:35:32.073 183284 DEBUG nova.compute.manager [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Preparing to wait for external event network-vif-plugged-a275d014-1cf3-4d9f-8e50-17958d3ca2a6 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 21 18:35:32 compute-0 nova_compute[183278]: 2026-01-21 18:35:32.073 183284 DEBUG oslo_concurrency.lockutils [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Acquiring lock "1b617117-9d85-42be-bfd4-9f5228156160-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:35:32 compute-0 nova_compute[183278]: 2026-01-21 18:35:32.073 183284 DEBUG oslo_concurrency.lockutils [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "1b617117-9d85-42be-bfd4-9f5228156160-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:35:32 compute-0 nova_compute[183278]: 2026-01-21 18:35:32.074 183284 DEBUG oslo_concurrency.lockutils [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "1b617117-9d85-42be-bfd4-9f5228156160-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:35:32 compute-0 nova_compute[183278]: 2026-01-21 18:35:32.074 183284 DEBUG nova.virt.libvirt.vif [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T18:35:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-656242378',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-656242378',id=23,image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fe688847145f4dee992c72dd40bbc1ac',ramdisk_id='',reservation_id='r-0prerflf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1753607426',owner_user_name='tempest-TestExecuteStrategies-1753607426-project-member'},tags=TagL
ist,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T18:35:25Z,user_data=None,user_id='41dc6e790bc54fbfaf5c6007d3fa5f63',uuid=1b617117-9d85-42be-bfd4-9f5228156160,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a275d014-1cf3-4d9f-8e50-17958d3ca2a6", "address": "fa:16:3e:c8:80:d8", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa275d014-1c", "ovs_interfaceid": "a275d014-1cf3-4d9f-8e50-17958d3ca2a6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 21 18:35:32 compute-0 nova_compute[183278]: 2026-01-21 18:35:32.075 183284 DEBUG nova.network.os_vif_util [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Converting VIF {"id": "a275d014-1cf3-4d9f-8e50-17958d3ca2a6", "address": "fa:16:3e:c8:80:d8", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa275d014-1c", "ovs_interfaceid": "a275d014-1cf3-4d9f-8e50-17958d3ca2a6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 18:35:32 compute-0 nova_compute[183278]: 2026-01-21 18:35:32.075 183284 DEBUG nova.network.os_vif_util [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c8:80:d8,bridge_name='br-int',has_traffic_filtering=True,id=a275d014-1cf3-4d9f-8e50-17958d3ca2a6,network=Network(405ec01b-76d3-4c3c-a31b-5f16d9641fbf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa275d014-1c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 18:35:32 compute-0 nova_compute[183278]: 2026-01-21 18:35:32.076 183284 DEBUG os_vif [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c8:80:d8,bridge_name='br-int',has_traffic_filtering=True,id=a275d014-1cf3-4d9f-8e50-17958d3ca2a6,network=Network(405ec01b-76d3-4c3c-a31b-5f16d9641fbf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa275d014-1c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 21 18:35:32 compute-0 nova_compute[183278]: 2026-01-21 18:35:32.076 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:35:32 compute-0 nova_compute[183278]: 2026-01-21 18:35:32.076 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:35:32 compute-0 nova_compute[183278]: 2026-01-21 18:35:32.077 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 18:35:32 compute-0 nova_compute[183278]: 2026-01-21 18:35:32.079 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:35:32 compute-0 nova_compute[183278]: 2026-01-21 18:35:32.079 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa275d014-1c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:35:32 compute-0 nova_compute[183278]: 2026-01-21 18:35:32.080 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa275d014-1c, col_values=(('external_ids', {'iface-id': 'a275d014-1cf3-4d9f-8e50-17958d3ca2a6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c8:80:d8', 'vm-uuid': '1b617117-9d85-42be-bfd4-9f5228156160'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:35:32 compute-0 NetworkManager[55506]: <info>  [1769020532.0821] manager: (tapa275d014-1c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/73)
Jan 21 18:35:32 compute-0 nova_compute[183278]: 2026-01-21 18:35:32.084 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 18:35:32 compute-0 nova_compute[183278]: 2026-01-21 18:35:32.089 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:35:32 compute-0 nova_compute[183278]: 2026-01-21 18:35:32.090 183284 INFO os_vif [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c8:80:d8,bridge_name='br-int',has_traffic_filtering=True,id=a275d014-1cf3-4d9f-8e50-17958d3ca2a6,network=Network(405ec01b-76d3-4c3c-a31b-5f16d9641fbf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa275d014-1c')
Jan 21 18:35:32 compute-0 nova_compute[183278]: 2026-01-21 18:35:32.129 183284 DEBUG nova.virt.libvirt.driver [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 18:35:32 compute-0 nova_compute[183278]: 2026-01-21 18:35:32.130 183284 DEBUG nova.virt.libvirt.driver [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 18:35:32 compute-0 nova_compute[183278]: 2026-01-21 18:35:32.130 183284 DEBUG nova.virt.libvirt.driver [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] No VIF found with MAC fa:16:3e:c8:80:d8, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 21 18:35:32 compute-0 nova_compute[183278]: 2026-01-21 18:35:32.130 183284 INFO nova.virt.libvirt.driver [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Using config drive
Jan 21 18:35:32 compute-0 nova_compute[183278]: 2026-01-21 18:35:32.870 183284 INFO nova.virt.libvirt.driver [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Creating config drive at /var/lib/nova/instances/1b617117-9d85-42be-bfd4-9f5228156160/disk.config
Jan 21 18:35:32 compute-0 nova_compute[183278]: 2026-01-21 18:35:32.875 183284 DEBUG oslo_concurrency.processutils [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1b617117-9d85-42be-bfd4-9f5228156160/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpqvujw8y3 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:35:32 compute-0 nova_compute[183278]: 2026-01-21 18:35:32.998 183284 DEBUG oslo_concurrency.processutils [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1b617117-9d85-42be-bfd4-9f5228156160/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpqvujw8y3" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:35:33 compute-0 kernel: tapa275d014-1c: entered promiscuous mode
Jan 21 18:35:33 compute-0 NetworkManager[55506]: <info>  [1769020533.0516] manager: (tapa275d014-1c): new Tun device (/org/freedesktop/NetworkManager/Devices/74)
Jan 21 18:35:33 compute-0 nova_compute[183278]: 2026-01-21 18:35:33.052 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:35:33 compute-0 ovn_controller[95419]: 2026-01-21T18:35:33Z|00166|binding|INFO|Claiming lport a275d014-1cf3-4d9f-8e50-17958d3ca2a6 for this chassis.
Jan 21 18:35:33 compute-0 ovn_controller[95419]: 2026-01-21T18:35:33Z|00167|binding|INFO|a275d014-1cf3-4d9f-8e50-17958d3ca2a6: Claiming fa:16:3e:c8:80:d8 10.100.0.13
Jan 21 18:35:33 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:35:33.059 104698 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c8:80:d8 10.100.0.13'], port_security=['fa:16:3e:c8:80:d8 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '1b617117-9d85-42be-bfd4-9f5228156160', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-405ec01b-76d3-4c3c-a31b-5f16d9641fbf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe688847145f4dee992c72dd40bbc1ac', 'neutron:revision_number': '2', 'neutron:security_group_ids': '772bc98e-4f63-476c-ac15-1e185ee339f7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=64b67886-c0d9-40d2-a2d0-cf96a9cd3c14, chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>], logical_port=a275d014-1cf3-4d9f-8e50-17958d3ca2a6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 18:35:33 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:35:33.060 104698 INFO neutron.agent.ovn.metadata.agent [-] Port a275d014-1cf3-4d9f-8e50-17958d3ca2a6 in datapath 405ec01b-76d3-4c3c-a31b-5f16d9641fbf bound to our chassis
Jan 21 18:35:33 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:35:33.061 104698 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 405ec01b-76d3-4c3c-a31b-5f16d9641fbf
Jan 21 18:35:33 compute-0 ovn_controller[95419]: 2026-01-21T18:35:33Z|00168|binding|INFO|Setting lport a275d014-1cf3-4d9f-8e50-17958d3ca2a6 up in Southbound
Jan 21 18:35:33 compute-0 ovn_controller[95419]: 2026-01-21T18:35:33Z|00169|binding|INFO|Setting lport a275d014-1cf3-4d9f-8e50-17958d3ca2a6 ovn-installed in OVS
Jan 21 18:35:33 compute-0 nova_compute[183278]: 2026-01-21 18:35:33.064 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:35:33 compute-0 nova_compute[183278]: 2026-01-21 18:35:33.066 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:35:33 compute-0 nova_compute[183278]: 2026-01-21 18:35:33.071 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:35:33 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:35:33.073 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[4a3d6c77-37d5-47d6-9f63-ac75e891b20b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:35:33 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:35:33.074 104698 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap405ec01b-71 in ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 21 18:35:33 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:35:33.076 203892 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap405ec01b-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 21 18:35:33 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:35:33.076 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[5a0df61a-af55-47d5-85e6-a1e81a203f4e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:35:33 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:35:33.077 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[67f0b0a3-0fdd-4767-9dca-92ce7d9929ab]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:35:33 compute-0 systemd-udevd[211305]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 18:35:33 compute-0 systemd-machined[154592]: New machine qemu-16-instance-00000017.
Jan 21 18:35:33 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:35:33.089 105036 DEBUG oslo.privsep.daemon [-] privsep: reply[77984715-78d0-4dfb-8651-a763a3837ac1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:35:33 compute-0 NetworkManager[55506]: <info>  [1769020533.0947] device (tapa275d014-1c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 18:35:33 compute-0 NetworkManager[55506]: <info>  [1769020533.0952] device (tapa275d014-1c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 18:35:33 compute-0 systemd[1]: Started Virtual Machine qemu-16-instance-00000017.
Jan 21 18:35:33 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:35:33.114 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[0798d215-a23b-4ab5-b05e-bb706483cb8b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:35:33 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:35:33.147 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[9814cb61-5253-4adf-9f6d-e1c929308436]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:35:33 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:35:33.152 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[6d38591e-b74f-449d-8f09-6f5b75e77133]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:35:33 compute-0 NetworkManager[55506]: <info>  [1769020533.1537] manager: (tap405ec01b-70): new Veth device (/org/freedesktop/NetworkManager/Devices/75)
Jan 21 18:35:33 compute-0 systemd-udevd[211309]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 18:35:33 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:35:33.182 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[373f60c9-da24-4bc0-892a-2b4c69ee69b5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:35:33 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:35:33.186 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[0c6b7e22-43b7-4a35-97a1-599084403912]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:35:33 compute-0 NetworkManager[55506]: <info>  [1769020533.2087] device (tap405ec01b-70): carrier: link connected
Jan 21 18:35:33 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:35:33.214 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[185f74b6-8af5-4964-a787-ba245f8f9c57]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:35:33 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:35:33.229 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[173cf905-5682-4fc2-8d79-009c4b3374e2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap405ec01b-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6c:95:02'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 52], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 509681, 'reachable_time': 25022, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211339, 'error': None, 'target': 'ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:35:33 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:35:33.245 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[7f57acb3-a33a-4172-9468-5196023f82b9]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe6c:9502'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 509681, 'tstamp': 509681}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211341, 'error': None, 'target': 'ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:35:33 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:35:33.263 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[04641c2b-b82a-4844-afad-e4722fc401e4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap405ec01b-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6c:95:02'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 52], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 509681, 'reachable_time': 25022, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 211346, 'error': None, 'target': 'ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:35:33 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:35:33.296 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[e88d978c-46a7-4d2e-b1e7-1814d6901995]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:35:33 compute-0 nova_compute[183278]: 2026-01-21 18:35:33.332 183284 DEBUG nova.virt.driver [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Emitting event <LifecycleEvent: 1769020533.3316429, 1b617117-9d85-42be-bfd4-9f5228156160 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:35:33 compute-0 nova_compute[183278]: 2026-01-21 18:35:33.332 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] VM Started (Lifecycle Event)
Jan 21 18:35:33 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:35:33.353 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[b9ed005a-f411-46db-8b21-7171f83dc812]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:35:33 compute-0 nova_compute[183278]: 2026-01-21 18:35:33.354 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:35:33 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:35:33.354 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap405ec01b-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:35:33 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:35:33.354 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 18:35:33 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:35:33.355 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap405ec01b-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:35:33 compute-0 nova_compute[183278]: 2026-01-21 18:35:33.356 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:35:33 compute-0 NetworkManager[55506]: <info>  [1769020533.3577] manager: (tap405ec01b-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/76)
Jan 21 18:35:33 compute-0 kernel: tap405ec01b-70: entered promiscuous mode
Jan 21 18:35:33 compute-0 nova_compute[183278]: 2026-01-21 18:35:33.359 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:35:33 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:35:33.360 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap405ec01b-70, col_values=(('external_ids', {'iface-id': '9c897ad2-8ce5-4903-8c83-1ed8f117dcdd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:35:33 compute-0 nova_compute[183278]: 2026-01-21 18:35:33.361 183284 DEBUG nova.virt.driver [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Emitting event <LifecycleEvent: 1769020533.334427, 1b617117-9d85-42be-bfd4-9f5228156160 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:35:33 compute-0 nova_compute[183278]: 2026-01-21 18:35:33.361 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] VM Paused (Lifecycle Event)
Jan 21 18:35:33 compute-0 ovn_controller[95419]: 2026-01-21T18:35:33Z|00170|binding|INFO|Releasing lport 9c897ad2-8ce5-4903-8c83-1ed8f117dcdd from this chassis (sb_readonly=0)
Jan 21 18:35:33 compute-0 nova_compute[183278]: 2026-01-21 18:35:33.362 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:35:33 compute-0 nova_compute[183278]: 2026-01-21 18:35:33.377 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:35:33 compute-0 nova_compute[183278]: 2026-01-21 18:35:33.378 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:35:33 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:35:33.379 104698 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/405ec01b-76d3-4c3c-a31b-5f16d9641fbf.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/405ec01b-76d3-4c3c-a31b-5f16d9641fbf.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 21 18:35:33 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:35:33.380 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[bb8eefb3-7060-4660-8a4c-70e2d81622b1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:35:33 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:35:33.381 104698 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 18:35:33 compute-0 ovn_metadata_agent[104693]: global
Jan 21 18:35:33 compute-0 ovn_metadata_agent[104693]:     log         /dev/log local0 debug
Jan 21 18:35:33 compute-0 ovn_metadata_agent[104693]:     log-tag     haproxy-metadata-proxy-405ec01b-76d3-4c3c-a31b-5f16d9641fbf
Jan 21 18:35:33 compute-0 ovn_metadata_agent[104693]:     user        root
Jan 21 18:35:33 compute-0 ovn_metadata_agent[104693]:     group       root
Jan 21 18:35:33 compute-0 ovn_metadata_agent[104693]:     maxconn     1024
Jan 21 18:35:33 compute-0 ovn_metadata_agent[104693]:     pidfile     /var/lib/neutron/external/pids/405ec01b-76d3-4c3c-a31b-5f16d9641fbf.pid.haproxy
Jan 21 18:35:33 compute-0 ovn_metadata_agent[104693]:     daemon
Jan 21 18:35:33 compute-0 ovn_metadata_agent[104693]: 
Jan 21 18:35:33 compute-0 ovn_metadata_agent[104693]: defaults
Jan 21 18:35:33 compute-0 ovn_metadata_agent[104693]:     log global
Jan 21 18:35:33 compute-0 ovn_metadata_agent[104693]:     mode http
Jan 21 18:35:33 compute-0 ovn_metadata_agent[104693]:     option httplog
Jan 21 18:35:33 compute-0 ovn_metadata_agent[104693]:     option dontlognull
Jan 21 18:35:33 compute-0 ovn_metadata_agent[104693]:     option http-server-close
Jan 21 18:35:33 compute-0 ovn_metadata_agent[104693]:     option forwardfor
Jan 21 18:35:33 compute-0 ovn_metadata_agent[104693]:     retries                 3
Jan 21 18:35:33 compute-0 ovn_metadata_agent[104693]:     timeout http-request    30s
Jan 21 18:35:33 compute-0 ovn_metadata_agent[104693]:     timeout connect         30s
Jan 21 18:35:33 compute-0 ovn_metadata_agent[104693]:     timeout client          32s
Jan 21 18:35:33 compute-0 ovn_metadata_agent[104693]:     timeout server          32s
Jan 21 18:35:33 compute-0 ovn_metadata_agent[104693]:     timeout http-keep-alive 30s
Jan 21 18:35:33 compute-0 ovn_metadata_agent[104693]: 
Jan 21 18:35:33 compute-0 ovn_metadata_agent[104693]: 
Jan 21 18:35:33 compute-0 ovn_metadata_agent[104693]: listen listener
Jan 21 18:35:33 compute-0 ovn_metadata_agent[104693]:     bind 169.254.169.254:80
Jan 21 18:35:33 compute-0 ovn_metadata_agent[104693]:     server metadata /var/lib/neutron/metadata_proxy
Jan 21 18:35:33 compute-0 ovn_metadata_agent[104693]:     http-request add-header X-OVN-Network-ID 405ec01b-76d3-4c3c-a31b-5f16d9641fbf
Jan 21 18:35:33 compute-0 ovn_metadata_agent[104693]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 21 18:35:33 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:35:33.381 104698 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf', 'env', 'PROCESS_TAG=haproxy-405ec01b-76d3-4c3c-a31b-5f16d9641fbf', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/405ec01b-76d3-4c3c-a31b-5f16d9641fbf.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 21 18:35:33 compute-0 nova_compute[183278]: 2026-01-21 18:35:33.382 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 18:35:33 compute-0 nova_compute[183278]: 2026-01-21 18:35:33.397 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 18:35:33 compute-0 nova_compute[183278]: 2026-01-21 18:35:33.552 183284 DEBUG nova.compute.manager [req-1ad104f9-6e26-44fa-99bc-a071b74436bd req-d2221979-8626-480c-91e5-21287c563402 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Received event network-vif-plugged-a275d014-1cf3-4d9f-8e50-17958d3ca2a6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:35:33 compute-0 nova_compute[183278]: 2026-01-21 18:35:33.553 183284 DEBUG oslo_concurrency.lockutils [req-1ad104f9-6e26-44fa-99bc-a071b74436bd req-d2221979-8626-480c-91e5-21287c563402 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "1b617117-9d85-42be-bfd4-9f5228156160-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:35:33 compute-0 nova_compute[183278]: 2026-01-21 18:35:33.553 183284 DEBUG oslo_concurrency.lockutils [req-1ad104f9-6e26-44fa-99bc-a071b74436bd req-d2221979-8626-480c-91e5-21287c563402 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "1b617117-9d85-42be-bfd4-9f5228156160-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:35:33 compute-0 nova_compute[183278]: 2026-01-21 18:35:33.554 183284 DEBUG oslo_concurrency.lockutils [req-1ad104f9-6e26-44fa-99bc-a071b74436bd req-d2221979-8626-480c-91e5-21287c563402 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "1b617117-9d85-42be-bfd4-9f5228156160-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:35:33 compute-0 nova_compute[183278]: 2026-01-21 18:35:33.554 183284 DEBUG nova.compute.manager [req-1ad104f9-6e26-44fa-99bc-a071b74436bd req-d2221979-8626-480c-91e5-21287c563402 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Processing event network-vif-plugged-a275d014-1cf3-4d9f-8e50-17958d3ca2a6 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 21 18:35:33 compute-0 nova_compute[183278]: 2026-01-21 18:35:33.555 183284 DEBUG nova.compute.manager [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 18:35:33 compute-0 nova_compute[183278]: 2026-01-21 18:35:33.557 183284 DEBUG nova.virt.driver [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Emitting event <LifecycleEvent: 1769020533.5576968, 1b617117-9d85-42be-bfd4-9f5228156160 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:35:33 compute-0 nova_compute[183278]: 2026-01-21 18:35:33.558 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] VM Resumed (Lifecycle Event)
Jan 21 18:35:33 compute-0 nova_compute[183278]: 2026-01-21 18:35:33.577 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:35:33 compute-0 nova_compute[183278]: 2026-01-21 18:35:33.579 183284 DEBUG nova.virt.libvirt.driver [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 18:35:33 compute-0 nova_compute[183278]: 2026-01-21 18:35:33.582 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 18:35:33 compute-0 nova_compute[183278]: 2026-01-21 18:35:33.586 183284 INFO nova.virt.libvirt.driver [-] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Instance spawned successfully.
Jan 21 18:35:33 compute-0 nova_compute[183278]: 2026-01-21 18:35:33.586 183284 DEBUG nova.virt.libvirt.driver [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 21 18:35:33 compute-0 nova_compute[183278]: 2026-01-21 18:35:33.615 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 18:35:33 compute-0 nova_compute[183278]: 2026-01-21 18:35:33.619 183284 DEBUG nova.virt.libvirt.driver [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:35:33 compute-0 nova_compute[183278]: 2026-01-21 18:35:33.619 183284 DEBUG nova.virt.libvirt.driver [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:35:33 compute-0 nova_compute[183278]: 2026-01-21 18:35:33.620 183284 DEBUG nova.virt.libvirt.driver [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:35:33 compute-0 nova_compute[183278]: 2026-01-21 18:35:33.621 183284 DEBUG nova.virt.libvirt.driver [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:35:33 compute-0 nova_compute[183278]: 2026-01-21 18:35:33.621 183284 DEBUG nova.virt.libvirt.driver [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:35:33 compute-0 nova_compute[183278]: 2026-01-21 18:35:33.622 183284 DEBUG nova.virt.libvirt.driver [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:35:33 compute-0 nova_compute[183278]: 2026-01-21 18:35:33.675 183284 INFO nova.compute.manager [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Took 7.84 seconds to spawn the instance on the hypervisor.
Jan 21 18:35:33 compute-0 nova_compute[183278]: 2026-01-21 18:35:33.676 183284 DEBUG nova.compute.manager [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:35:33 compute-0 podman[211379]: 2026-01-21 18:35:33.734902227 +0000 UTC m=+0.044026425 container create ffe31573eadd3d38ab43b2009064f6f951e44a46829eb91842dd642b06b5e1d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 21 18:35:33 compute-0 nova_compute[183278]: 2026-01-21 18:35:33.742 183284 INFO nova.compute.manager [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Took 8.50 seconds to build instance.
Jan 21 18:35:33 compute-0 nova_compute[183278]: 2026-01-21 18:35:33.765 183284 DEBUG oslo_concurrency.lockutils [None req-0c8b8e00-6012-4906-9ac1-d48072a5a957 41dc6e790bc54fbfaf5c6007d3fa5f63 fe688847145f4dee992c72dd40bbc1ac - - default default] Lock "1b617117-9d85-42be-bfd4-9f5228156160" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.574s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:35:33 compute-0 systemd[1]: Started libpod-conmon-ffe31573eadd3d38ab43b2009064f6f951e44a46829eb91842dd642b06b5e1d8.scope.
Jan 21 18:35:33 compute-0 systemd[1]: Started libcrun container.
Jan 21 18:35:33 compute-0 podman[211379]: 2026-01-21 18:35:33.710548338 +0000 UTC m=+0.019672556 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 18:35:33 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0b091a1e4556dbbdaa466ff0ada2c40d1859e3334f154182d4043b28a48075de/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 18:35:33 compute-0 podman[211379]: 2026-01-21 18:35:33.826450118 +0000 UTC m=+0.135574336 container init ffe31573eadd3d38ab43b2009064f6f951e44a46829eb91842dd642b06b5e1d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 18:35:33 compute-0 podman[211379]: 2026-01-21 18:35:33.832056043 +0000 UTC m=+0.141180241 container start ffe31573eadd3d38ab43b2009064f6f951e44a46829eb91842dd642b06b5e1d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Jan 21 18:35:33 compute-0 neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf[211394]: [NOTICE]   (211398) : New worker (211400) forked
Jan 21 18:35:33 compute-0 neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf[211394]: [NOTICE]   (211398) : Loading success.
Jan 21 18:35:33 compute-0 nova_compute[183278]: 2026-01-21 18:35:33.940 183284 DEBUG nova.network.neutron [req-d9232d19-d921-45bb-b604-fce2fa063587 req-e7552117-97b0-401e-bb8a-8fdee13d7a08 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Updated VIF entry in instance network info cache for port a275d014-1cf3-4d9f-8e50-17958d3ca2a6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 18:35:33 compute-0 nova_compute[183278]: 2026-01-21 18:35:33.941 183284 DEBUG nova.network.neutron [req-d9232d19-d921-45bb-b604-fce2fa063587 req-e7552117-97b0-401e-bb8a-8fdee13d7a08 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Updating instance_info_cache with network_info: [{"id": "a275d014-1cf3-4d9f-8e50-17958d3ca2a6", "address": "fa:16:3e:c8:80:d8", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa275d014-1c", "ovs_interfaceid": "a275d014-1cf3-4d9f-8e50-17958d3ca2a6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:35:33 compute-0 nova_compute[183278]: 2026-01-21 18:35:33.957 183284 DEBUG oslo_concurrency.lockutils [req-d9232d19-d921-45bb-b604-fce2fa063587 req-e7552117-97b0-401e-bb8a-8fdee13d7a08 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Releasing lock "refresh_cache-1b617117-9d85-42be-bfd4-9f5228156160" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 18:35:35 compute-0 nova_compute[183278]: 2026-01-21 18:35:35.620 183284 DEBUG nova.compute.manager [req-214287c5-16b1-4324-b4f0-9ce9f6198e7e req-da1acbf1-18ab-49cf-bda7-3ec4db808f0b 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Received event network-vif-plugged-a275d014-1cf3-4d9f-8e50-17958d3ca2a6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:35:35 compute-0 nova_compute[183278]: 2026-01-21 18:35:35.621 183284 DEBUG oslo_concurrency.lockutils [req-214287c5-16b1-4324-b4f0-9ce9f6198e7e req-da1acbf1-18ab-49cf-bda7-3ec4db808f0b 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "1b617117-9d85-42be-bfd4-9f5228156160-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:35:35 compute-0 nova_compute[183278]: 2026-01-21 18:35:35.621 183284 DEBUG oslo_concurrency.lockutils [req-214287c5-16b1-4324-b4f0-9ce9f6198e7e req-da1acbf1-18ab-49cf-bda7-3ec4db808f0b 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "1b617117-9d85-42be-bfd4-9f5228156160-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:35:35 compute-0 nova_compute[183278]: 2026-01-21 18:35:35.621 183284 DEBUG oslo_concurrency.lockutils [req-214287c5-16b1-4324-b4f0-9ce9f6198e7e req-da1acbf1-18ab-49cf-bda7-3ec4db808f0b 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "1b617117-9d85-42be-bfd4-9f5228156160-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:35:35 compute-0 nova_compute[183278]: 2026-01-21 18:35:35.622 183284 DEBUG nova.compute.manager [req-214287c5-16b1-4324-b4f0-9ce9f6198e7e req-da1acbf1-18ab-49cf-bda7-3ec4db808f0b 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] No waiting events found dispatching network-vif-plugged-a275d014-1cf3-4d9f-8e50-17958d3ca2a6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:35:35 compute-0 nova_compute[183278]: 2026-01-21 18:35:35.622 183284 WARNING nova.compute.manager [req-214287c5-16b1-4324-b4f0-9ce9f6198e7e req-da1acbf1-18ab-49cf-bda7-3ec4db808f0b 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Received unexpected event network-vif-plugged-a275d014-1cf3-4d9f-8e50-17958d3ca2a6 for instance with vm_state active and task_state None.
Jan 21 18:35:35 compute-0 nova_compute[183278]: 2026-01-21 18:35:35.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:35:36 compute-0 nova_compute[183278]: 2026-01-21 18:35:36.110 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:35:36 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:35:36.774 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a4db021d-a451-4e5f-8011-49af760bda68, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '22'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:35:36 compute-0 nova_compute[183278]: 2026-01-21 18:35:36.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:35:36 compute-0 nova_compute[183278]: 2026-01-21 18:35:36.817 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 18:35:37 compute-0 nova_compute[183278]: 2026-01-21 18:35:37.119 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:35:41 compute-0 nova_compute[183278]: 2026-01-21 18:35:41.111 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:35:42 compute-0 nova_compute[183278]: 2026-01-21 18:35:42.122 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:35:43 compute-0 podman[211410]: 2026-01-21 18:35:43.036788968 +0000 UTC m=+0.091058290 container health_status 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, release=1755695350, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, architecture=x86_64, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, vcs-type=git)
Jan 21 18:35:45 compute-0 ovn_controller[95419]: 2026-01-21T18:35:45Z|00028|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:c8:80:d8 10.100.0.13
Jan 21 18:35:45 compute-0 ovn_controller[95419]: 2026-01-21T18:35:45Z|00029|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c8:80:d8 10.100.0.13
Jan 21 18:35:46 compute-0 nova_compute[183278]: 2026-01-21 18:35:46.113 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:35:47 compute-0 nova_compute[183278]: 2026-01-21 18:35:47.125 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:35:47 compute-0 nova_compute[183278]: 2026-01-21 18:35:47.818 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:35:50 compute-0 podman[211450]: 2026-01-21 18:35:50.01490535 +0000 UTC m=+0.054541058 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 18:35:50 compute-0 podman[211449]: 2026-01-21 18:35:50.035001296 +0000 UTC m=+0.081501090 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, 
container_name=ovn_controller, org.label-schema.build-date=20251202)
Jan 21 18:35:51 compute-0 nova_compute[183278]: 2026-01-21 18:35:51.115 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:35:51 compute-0 sshd-session[211494]: Invalid user geth from 64.227.98.100 port 39252
Jan 21 18:35:51 compute-0 sshd-session[211494]: Connection closed by invalid user geth 64.227.98.100 port 39252 [preauth]
Jan 21 18:35:52 compute-0 nova_compute[183278]: 2026-01-21 18:35:52.164 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:35:55 compute-0 podman[211496]: 2026-01-21 18:35:55.000638208 +0000 UTC m=+0.050844189 container health_status 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 18:35:56 compute-0 nova_compute[183278]: 2026-01-21 18:35:56.117 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:35:57 compute-0 nova_compute[183278]: 2026-01-21 18:35:57.166 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:35:59 compute-0 podman[192560]: time="2026-01-21T18:35:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 18:35:59 compute-0 podman[192560]: @ - - [21/Jan/2026:18:35:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16587 "" "Go-http-client/1.1"
Jan 21 18:35:59 compute-0 podman[192560]: @ - - [21/Jan/2026:18:35:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2638 "" "Go-http-client/1.1"
Jan 21 18:36:01 compute-0 nova_compute[183278]: 2026-01-21 18:36:01.119 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:36:01 compute-0 openstack_network_exporter[195402]: ERROR   18:36:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 21 18:36:01 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:36:01 compute-0 openstack_network_exporter[195402]: ERROR   18:36:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 21 18:36:01 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:36:02 compute-0 nova_compute[183278]: 2026-01-21 18:36:02.207 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:36:04 compute-0 nova_compute[183278]: 2026-01-21 18:36:04.725 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:36:04 compute-0 nova_compute[183278]: 2026-01-21 18:36:04.880 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Triggering sync for uuid 1b617117-9d85-42be-bfd4-9f5228156160 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Jan 21 18:36:04 compute-0 nova_compute[183278]: 2026-01-21 18:36:04.881 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "1b617117-9d85-42be-bfd4-9f5228156160" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:36:04 compute-0 nova_compute[183278]: 2026-01-21 18:36:04.881 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "1b617117-9d85-42be-bfd4-9f5228156160" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:36:05 compute-0 nova_compute[183278]: 2026-01-21 18:36:05.035 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "1b617117-9d85-42be-bfd4-9f5228156160" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.153s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:36:06 compute-0 nova_compute[183278]: 2026-01-21 18:36:06.120 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:36:07 compute-0 nova_compute[183278]: 2026-01-21 18:36:07.209 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:36:10 compute-0 ovn_controller[95419]: 2026-01-21T18:36:10Z|00171|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 21 18:36:11 compute-0 nova_compute[183278]: 2026-01-21 18:36:11.121 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:36:12 compute-0 nova_compute[183278]: 2026-01-21 18:36:12.211 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:36:13 compute-0 podman[211520]: 2026-01-21 18:36:13.991125142 +0000 UTC m=+0.051648209 container health_status 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, architecture=x86_64, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., release=1755695350, config_id=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, vcs-type=git, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Jan 21 18:36:16 compute-0 nova_compute[183278]: 2026-01-21 18:36:16.124 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:36:17 compute-0 nova_compute[183278]: 2026-01-21 18:36:17.212 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:36:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:36:20.105 104698 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:36:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:36:20.106 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:36:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:36:20.106 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:36:21 compute-0 podman[211544]: 2026-01-21 18:36:21.003664178 +0000 UTC m=+0.051338932 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3)
Jan 21 18:36:21 compute-0 podman[211543]: 2026-01-21 18:36:21.023279581 +0000 UTC m=+0.074726886 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, 
tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 21 18:36:21 compute-0 nova_compute[183278]: 2026-01-21 18:36:21.126 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:36:22 compute-0 nova_compute[183278]: 2026-01-21 18:36:22.213 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:36:24 compute-0 nova_compute[183278]: 2026-01-21 18:36:24.973 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:36:24 compute-0 nova_compute[183278]: 2026-01-21 18:36:24.974 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 18:36:24 compute-0 nova_compute[183278]: 2026-01-21 18:36:24.974 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 18:36:24 compute-0 nova_compute[183278]: 2026-01-21 18:36:24.999 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "refresh_cache-1b617117-9d85-42be-bfd4-9f5228156160" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:36:24 compute-0 nova_compute[183278]: 2026-01-21 18:36:24.999 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquired lock "refresh_cache-1b617117-9d85-42be-bfd4-9f5228156160" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 18:36:24 compute-0 nova_compute[183278]: 2026-01-21 18:36:24.999 183284 DEBUG nova.network.neutron [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 21 18:36:25 compute-0 nova_compute[183278]: 2026-01-21 18:36:24.999 183284 DEBUG nova.objects.instance [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lazy-loading 'info_cache' on Instance uuid 1b617117-9d85-42be-bfd4-9f5228156160 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:36:25 compute-0 podman[211589]: 2026-01-21 18:36:25.989427806 +0000 UTC m=+0.047515579 container health_status 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 21 18:36:25 compute-0 nova_compute[183278]: 2026-01-21 18:36:25.997 183284 DEBUG nova.virt.libvirt.driver [None req-54e8f3a4-ba4f-4056-b138-d2a72deefcde 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Check if temp file /var/lib/nova/instances/tmpng_t0wg5 exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Jan 21 18:36:25 compute-0 nova_compute[183278]: 2026-01-21 18:36:25.998 183284 DEBUG nova.compute.manager [None req-54e8f3a4-ba4f-4056-b138-d2a72deefcde 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpng_t0wg5',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='1b617117-9d85-42be-bfd4-9f5228156160',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Jan 21 18:36:26 compute-0 nova_compute[183278]: 2026-01-21 18:36:26.127 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:36:27 compute-0 nova_compute[183278]: 2026-01-21 18:36:27.215 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:36:27 compute-0 nova_compute[183278]: 2026-01-21 18:36:27.686 183284 DEBUG oslo_concurrency.processutils [None req-54e8f3a4-ba4f-4056-b138-d2a72deefcde 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1b617117-9d85-42be-bfd4-9f5228156160/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:36:27 compute-0 nova_compute[183278]: 2026-01-21 18:36:27.739 183284 DEBUG nova.network.neutron [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Updating instance_info_cache with network_info: [{"id": "a275d014-1cf3-4d9f-8e50-17958d3ca2a6", "address": "fa:16:3e:c8:80:d8", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa275d014-1c", "ovs_interfaceid": "a275d014-1cf3-4d9f-8e50-17958d3ca2a6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:36:27 compute-0 nova_compute[183278]: 2026-01-21 18:36:27.743 183284 DEBUG oslo_concurrency.processutils [None req-54e8f3a4-ba4f-4056-b138-d2a72deefcde 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1b617117-9d85-42be-bfd4-9f5228156160/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:36:27 compute-0 nova_compute[183278]: 2026-01-21 18:36:27.744 183284 DEBUG oslo_concurrency.processutils [None req-54e8f3a4-ba4f-4056-b138-d2a72deefcde 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1b617117-9d85-42be-bfd4-9f5228156160/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:36:27 compute-0 nova_compute[183278]: 2026-01-21 18:36:27.761 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Releasing lock "refresh_cache-1b617117-9d85-42be-bfd4-9f5228156160" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 18:36:27 compute-0 nova_compute[183278]: 2026-01-21 18:36:27.761 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 21 18:36:27 compute-0 nova_compute[183278]: 2026-01-21 18:36:27.801 183284 DEBUG oslo_concurrency.processutils [None req-54e8f3a4-ba4f-4056-b138-d2a72deefcde 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1b617117-9d85-42be-bfd4-9f5228156160/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:36:28 compute-0 nova_compute[183278]: 2026-01-21 18:36:28.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:36:28 compute-0 nova_compute[183278]: 2026-01-21 18:36:28.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:36:28 compute-0 nova_compute[183278]: 2026-01-21 18:36:28.842 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:36:28 compute-0 nova_compute[183278]: 2026-01-21 18:36:28.843 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:36:28 compute-0 nova_compute[183278]: 2026-01-21 18:36:28.843 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:36:28 compute-0 nova_compute[183278]: 2026-01-21 18:36:28.844 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 18:36:28 compute-0 nova_compute[183278]: 2026-01-21 18:36:28.903 183284 DEBUG oslo_concurrency.processutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1b617117-9d85-42be-bfd4-9f5228156160/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:36:28 compute-0 nova_compute[183278]: 2026-01-21 18:36:28.961 183284 DEBUG oslo_concurrency.processutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1b617117-9d85-42be-bfd4-9f5228156160/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:36:28 compute-0 nova_compute[183278]: 2026-01-21 18:36:28.962 183284 DEBUG oslo_concurrency.processutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1b617117-9d85-42be-bfd4-9f5228156160/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:36:29 compute-0 nova_compute[183278]: 2026-01-21 18:36:29.021 183284 DEBUG oslo_concurrency.processutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1b617117-9d85-42be-bfd4-9f5228156160/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:36:29 compute-0 nova_compute[183278]: 2026-01-21 18:36:29.207 183284 WARNING nova.virt.libvirt.driver [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 18:36:29 compute-0 nova_compute[183278]: 2026-01-21 18:36:29.208 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5650MB free_disk=73.3501205444336GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 18:36:29 compute-0 nova_compute[183278]: 2026-01-21 18:36:29.208 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:36:29 compute-0 nova_compute[183278]: 2026-01-21 18:36:29.209 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:36:29 compute-0 nova_compute[183278]: 2026-01-21 18:36:29.353 183284 INFO nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Updating resource usage from migration eaef418c-4b38-4e87-83d9-1ff31ac41938
Jan 21 18:36:29 compute-0 nova_compute[183278]: 2026-01-21 18:36:29.386 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Migration eaef418c-4b38-4e87-83d9-1ff31ac41938 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Jan 21 18:36:29 compute-0 nova_compute[183278]: 2026-01-21 18:36:29.386 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 18:36:29 compute-0 nova_compute[183278]: 2026-01-21 18:36:29.386 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 18:36:29 compute-0 nova_compute[183278]: 2026-01-21 18:36:29.497 183284 DEBUG nova.compute.provider_tree [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Inventory has not changed in ProviderTree for provider: 502e4243-611b-433d-a766-9b485d51652d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 18:36:29 compute-0 nova_compute[183278]: 2026-01-21 18:36:29.533 183284 DEBUG nova.scheduler.client.report [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Inventory has not changed for provider 502e4243-611b-433d-a766-9b485d51652d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 18:36:29 compute-0 nova_compute[183278]: 2026-01-21 18:36:29.560 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 18:36:29 compute-0 nova_compute[183278]: 2026-01-21 18:36:29.560 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.352s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:36:29 compute-0 podman[192560]: time="2026-01-21T18:36:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 18:36:29 compute-0 podman[192560]: @ - - [21/Jan/2026:18:36:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16587 "" "Go-http-client/1.1"
Jan 21 18:36:29 compute-0 podman[192560]: @ - - [21/Jan/2026:18:36:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2641 "" "Go-http-client/1.1"
Jan 21 18:36:30 compute-0 nova_compute[183278]: 2026-01-21 18:36:30.560 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:36:31 compute-0 nova_compute[183278]: 2026-01-21 18:36:31.129 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:36:31 compute-0 sshd-session[211627]: Accepted publickey for nova from 192.168.122.101 port 35772 ssh2: ECDSA SHA256:29a5JNhHHz2bb0ACqZTr6qOKeSRnhiTRA8SK+rzn9gs
Jan 21 18:36:31 compute-0 systemd-logind[782]: New session 40 of user nova.
Jan 21 18:36:31 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Jan 21 18:36:31 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 21 18:36:31 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 21 18:36:31 compute-0 systemd[1]: Starting User Manager for UID 42436...
Jan 21 18:36:31 compute-0 systemd[211631]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 21 18:36:31 compute-0 openstack_network_exporter[195402]: ERROR   18:36:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 21 18:36:31 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:36:31 compute-0 openstack_network_exporter[195402]: ERROR   18:36:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 21 18:36:31 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:36:31 compute-0 systemd[211631]: Queued start job for default target Main User Target.
Jan 21 18:36:31 compute-0 systemd[211631]: Created slice User Application Slice.
Jan 21 18:36:31 compute-0 systemd[211631]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 21 18:36:31 compute-0 systemd[211631]: Started Daily Cleanup of User's Temporary Directories.
Jan 21 18:36:31 compute-0 systemd[211631]: Reached target Paths.
Jan 21 18:36:31 compute-0 systemd[211631]: Reached target Timers.
Jan 21 18:36:31 compute-0 systemd[211631]: Starting D-Bus User Message Bus Socket...
Jan 21 18:36:31 compute-0 systemd[211631]: Starting Create User's Volatile Files and Directories...
Jan 21 18:36:31 compute-0 systemd[211631]: Finished Create User's Volatile Files and Directories.
Jan 21 18:36:31 compute-0 systemd[211631]: Listening on D-Bus User Message Bus Socket.
Jan 21 18:36:31 compute-0 systemd[211631]: Reached target Sockets.
Jan 21 18:36:31 compute-0 systemd[211631]: Reached target Basic System.
Jan 21 18:36:31 compute-0 systemd[211631]: Reached target Main User Target.
Jan 21 18:36:31 compute-0 systemd[211631]: Startup finished in 135ms.
Jan 21 18:36:31 compute-0 systemd[1]: Started User Manager for UID 42436.
Jan 21 18:36:31 compute-0 systemd[1]: Started Session 40 of User nova.
Jan 21 18:36:31 compute-0 sshd-session[211627]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 21 18:36:31 compute-0 sshd-session[211646]: Received disconnect from 192.168.122.101 port 35772:11: disconnected by user
Jan 21 18:36:31 compute-0 sshd-session[211646]: Disconnected from user nova 192.168.122.101 port 35772
Jan 21 18:36:31 compute-0 sshd-session[211627]: pam_unix(sshd:session): session closed for user nova
Jan 21 18:36:31 compute-0 systemd[1]: session-40.scope: Deactivated successfully.
Jan 21 18:36:31 compute-0 systemd-logind[782]: Session 40 logged out. Waiting for processes to exit.
Jan 21 18:36:31 compute-0 systemd-logind[782]: Removed session 40.
Jan 21 18:36:31 compute-0 nova_compute[183278]: 2026-01-21 18:36:31.812 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:36:31 compute-0 nova_compute[183278]: 2026-01-21 18:36:31.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:36:31 compute-0 nova_compute[183278]: 2026-01-21 18:36:31.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:36:32 compute-0 nova_compute[183278]: 2026-01-21 18:36:32.218 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:36:33 compute-0 nova_compute[183278]: 2026-01-21 18:36:33.746 183284 DEBUG nova.compute.manager [req-fa861859-d19a-4f1c-8e2b-412aa7c46e1f req-bf20723f-c6b2-49c7-ac85-7fca149fdf7b 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Received event network-vif-unplugged-a275d014-1cf3-4d9f-8e50-17958d3ca2a6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:36:33 compute-0 nova_compute[183278]: 2026-01-21 18:36:33.746 183284 DEBUG oslo_concurrency.lockutils [req-fa861859-d19a-4f1c-8e2b-412aa7c46e1f req-bf20723f-c6b2-49c7-ac85-7fca149fdf7b 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "1b617117-9d85-42be-bfd4-9f5228156160-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:36:33 compute-0 nova_compute[183278]: 2026-01-21 18:36:33.746 183284 DEBUG oslo_concurrency.lockutils [req-fa861859-d19a-4f1c-8e2b-412aa7c46e1f req-bf20723f-c6b2-49c7-ac85-7fca149fdf7b 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "1b617117-9d85-42be-bfd4-9f5228156160-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:36:33 compute-0 nova_compute[183278]: 2026-01-21 18:36:33.746 183284 DEBUG oslo_concurrency.lockutils [req-fa861859-d19a-4f1c-8e2b-412aa7c46e1f req-bf20723f-c6b2-49c7-ac85-7fca149fdf7b 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "1b617117-9d85-42be-bfd4-9f5228156160-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:36:33 compute-0 nova_compute[183278]: 2026-01-21 18:36:33.747 183284 DEBUG nova.compute.manager [req-fa861859-d19a-4f1c-8e2b-412aa7c46e1f req-bf20723f-c6b2-49c7-ac85-7fca149fdf7b 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] No waiting events found dispatching network-vif-unplugged-a275d014-1cf3-4d9f-8e50-17958d3ca2a6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:36:33 compute-0 nova_compute[183278]: 2026-01-21 18:36:33.747 183284 DEBUG nova.compute.manager [req-fa861859-d19a-4f1c-8e2b-412aa7c46e1f req-bf20723f-c6b2-49c7-ac85-7fca149fdf7b 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Received event network-vif-unplugged-a275d014-1cf3-4d9f-8e50-17958d3ca2a6 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 21 18:36:33 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:36:33.777 104698 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=23, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e2:aa:66', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f6:39:87:73:c8'}, ipsec=False) old=SB_Global(nb_cfg=22) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 18:36:33 compute-0 nova_compute[183278]: 2026-01-21 18:36:33.777 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:36:33 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:36:33.778 104698 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 21 18:36:33 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:36:33.779 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a4db021d-a451-4e5f-8011-49af760bda68, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '23'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:36:35 compute-0 nova_compute[183278]: 2026-01-21 18:36:35.963 183284 DEBUG nova.compute.manager [req-43e888c1-0d43-488b-8786-5d58407d1fad req-79fae54a-72cc-4688-9089-e5c41c5fad07 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Received event network-vif-plugged-a275d014-1cf3-4d9f-8e50-17958d3ca2a6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:36:35 compute-0 nova_compute[183278]: 2026-01-21 18:36:35.963 183284 DEBUG oslo_concurrency.lockutils [req-43e888c1-0d43-488b-8786-5d58407d1fad req-79fae54a-72cc-4688-9089-e5c41c5fad07 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "1b617117-9d85-42be-bfd4-9f5228156160-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:36:35 compute-0 nova_compute[183278]: 2026-01-21 18:36:35.963 183284 DEBUG oslo_concurrency.lockutils [req-43e888c1-0d43-488b-8786-5d58407d1fad req-79fae54a-72cc-4688-9089-e5c41c5fad07 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "1b617117-9d85-42be-bfd4-9f5228156160-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:36:35 compute-0 nova_compute[183278]: 2026-01-21 18:36:35.964 183284 DEBUG oslo_concurrency.lockutils [req-43e888c1-0d43-488b-8786-5d58407d1fad req-79fae54a-72cc-4688-9089-e5c41c5fad07 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "1b617117-9d85-42be-bfd4-9f5228156160-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:36:35 compute-0 nova_compute[183278]: 2026-01-21 18:36:35.964 183284 DEBUG nova.compute.manager [req-43e888c1-0d43-488b-8786-5d58407d1fad req-79fae54a-72cc-4688-9089-e5c41c5fad07 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] No waiting events found dispatching network-vif-plugged-a275d014-1cf3-4d9f-8e50-17958d3ca2a6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:36:35 compute-0 nova_compute[183278]: 2026-01-21 18:36:35.964 183284 WARNING nova.compute.manager [req-43e888c1-0d43-488b-8786-5d58407d1fad req-79fae54a-72cc-4688-9089-e5c41c5fad07 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Received unexpected event network-vif-plugged-a275d014-1cf3-4d9f-8e50-17958d3ca2a6 for instance with vm_state active and task_state migrating.
Jan 21 18:36:36 compute-0 nova_compute[183278]: 2026-01-21 18:36:36.132 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:36:36 compute-0 nova_compute[183278]: 2026-01-21 18:36:36.808 183284 INFO nova.compute.manager [None req-54e8f3a4-ba4f-4056-b138-d2a72deefcde 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Took 9.01 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Jan 21 18:36:36 compute-0 nova_compute[183278]: 2026-01-21 18:36:36.808 183284 DEBUG nova.compute.manager [None req-54e8f3a4-ba4f-4056-b138-d2a72deefcde 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 18:36:36 compute-0 nova_compute[183278]: 2026-01-21 18:36:36.815 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:36:36 compute-0 nova_compute[183278]: 2026-01-21 18:36:36.822 183284 DEBUG nova.compute.manager [None req-54e8f3a4-ba4f-4056-b138-d2a72deefcde 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpng_t0wg5',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='1b617117-9d85-42be-bfd4-9f5228156160',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(eaef418c-4b38-4e87-83d9-1ff31ac41938),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Jan 21 18:36:36 compute-0 nova_compute[183278]: 2026-01-21 18:36:36.844 183284 DEBUG nova.objects.instance [None req-54e8f3a4-ba4f-4056-b138-d2a72deefcde 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lazy-loading 'migration_context' on Instance uuid 1b617117-9d85-42be-bfd4-9f5228156160 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:36:36 compute-0 nova_compute[183278]: 2026-01-21 18:36:36.845 183284 DEBUG nova.virt.libvirt.driver [None req-54e8f3a4-ba4f-4056-b138-d2a72deefcde 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Jan 21 18:36:36 compute-0 nova_compute[183278]: 2026-01-21 18:36:36.847 183284 DEBUG nova.virt.libvirt.driver [None req-54e8f3a4-ba4f-4056-b138-d2a72deefcde 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Jan 21 18:36:36 compute-0 nova_compute[183278]: 2026-01-21 18:36:36.848 183284 DEBUG nova.virt.libvirt.driver [None req-54e8f3a4-ba4f-4056-b138-d2a72deefcde 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Jan 21 18:36:36 compute-0 nova_compute[183278]: 2026-01-21 18:36:36.864 183284 DEBUG nova.virt.libvirt.vif [None req-54e8f3a4-ba4f-4056-b138-d2a72deefcde 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T18:35:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-656242378',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-656242378',id=23,image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T18:35:33Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='fe688847145f4dee992c72dd40bbc1ac',ramdisk_id='',reservation_id='r-0prerflf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1753607426',owner_user_name='tempest-TestExecuteStrategies-1753607426-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T18:35:33Z,user_data=None,user_id='41dc6e790bc54fbfaf5c6007d3fa5f63',uuid=1b617117-9d85-42be-bfd4-9f5228156160,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a275d014-1cf3-4d9f-8e50-17958d3ca2a6", "address": "fa:16:3e:c8:80:d8", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapa275d014-1c", "ovs_interfaceid": "a275d014-1cf3-4d9f-8e50-17958d3ca2a6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 18:36:36 compute-0 nova_compute[183278]: 2026-01-21 18:36:36.864 183284 DEBUG nova.network.os_vif_util [None req-54e8f3a4-ba4f-4056-b138-d2a72deefcde 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Converting VIF {"id": "a275d014-1cf3-4d9f-8e50-17958d3ca2a6", "address": "fa:16:3e:c8:80:d8", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapa275d014-1c", "ovs_interfaceid": "a275d014-1cf3-4d9f-8e50-17958d3ca2a6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 18:36:36 compute-0 nova_compute[183278]: 2026-01-21 18:36:36.865 183284 DEBUG nova.network.os_vif_util [None req-54e8f3a4-ba4f-4056-b138-d2a72deefcde 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c8:80:d8,bridge_name='br-int',has_traffic_filtering=True,id=a275d014-1cf3-4d9f-8e50-17958d3ca2a6,network=Network(405ec01b-76d3-4c3c-a31b-5f16d9641fbf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa275d014-1c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 18:36:36 compute-0 nova_compute[183278]: 2026-01-21 18:36:36.865 183284 DEBUG nova.virt.libvirt.migration [None req-54e8f3a4-ba4f-4056-b138-d2a72deefcde 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Updating guest XML with vif config: <interface type="ethernet">
Jan 21 18:36:36 compute-0 nova_compute[183278]:   <mac address="fa:16:3e:c8:80:d8"/>
Jan 21 18:36:36 compute-0 nova_compute[183278]:   <model type="virtio"/>
Jan 21 18:36:36 compute-0 nova_compute[183278]:   <driver name="vhost" rx_queue_size="512"/>
Jan 21 18:36:36 compute-0 nova_compute[183278]:   <mtu size="1442"/>
Jan 21 18:36:36 compute-0 nova_compute[183278]:   <target dev="tapa275d014-1c"/>
Jan 21 18:36:36 compute-0 nova_compute[183278]: </interface>
Jan 21 18:36:36 compute-0 nova_compute[183278]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Jan 21 18:36:36 compute-0 nova_compute[183278]: 2026-01-21 18:36:36.866 183284 DEBUG nova.virt.libvirt.driver [None req-54e8f3a4-ba4f-4056-b138-d2a72deefcde 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Jan 21 18:36:37 compute-0 nova_compute[183278]: 2026-01-21 18:36:37.220 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:36:37 compute-0 nova_compute[183278]: 2026-01-21 18:36:37.350 183284 DEBUG nova.virt.libvirt.migration [None req-54e8f3a4-ba4f-4056-b138-d2a72deefcde 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 21 18:36:37 compute-0 nova_compute[183278]: 2026-01-21 18:36:37.351 183284 INFO nova.virt.libvirt.migration [None req-54e8f3a4-ba4f-4056-b138-d2a72deefcde 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Increasing downtime to 50 ms after 0 sec elapsed time
Jan 21 18:36:37 compute-0 nova_compute[183278]: 2026-01-21 18:36:37.414 183284 INFO nova.virt.libvirt.driver [None req-54e8f3a4-ba4f-4056-b138-d2a72deefcde 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Jan 21 18:36:37 compute-0 nova_compute[183278]: 2026-01-21 18:36:37.918 183284 DEBUG nova.virt.libvirt.migration [None req-54e8f3a4-ba4f-4056-b138-d2a72deefcde 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 21 18:36:37 compute-0 nova_compute[183278]: 2026-01-21 18:36:37.919 183284 DEBUG nova.virt.libvirt.migration [None req-54e8f3a4-ba4f-4056-b138-d2a72deefcde 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Jan 21 18:36:38 compute-0 nova_compute[183278]: 2026-01-21 18:36:38.034 183284 DEBUG nova.compute.manager [req-9ae2b3e9-93a0-49c5-a24e-fbe21b29627f req-fde06f99-8e90-4ae3-8c46-ba010d8bed80 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Received event network-changed-a275d014-1cf3-4d9f-8e50-17958d3ca2a6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:36:38 compute-0 nova_compute[183278]: 2026-01-21 18:36:38.034 183284 DEBUG nova.compute.manager [req-9ae2b3e9-93a0-49c5-a24e-fbe21b29627f req-fde06f99-8e90-4ae3-8c46-ba010d8bed80 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Refreshing instance network info cache due to event network-changed-a275d014-1cf3-4d9f-8e50-17958d3ca2a6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 18:36:38 compute-0 nova_compute[183278]: 2026-01-21 18:36:38.035 183284 DEBUG oslo_concurrency.lockutils [req-9ae2b3e9-93a0-49c5-a24e-fbe21b29627f req-fde06f99-8e90-4ae3-8c46-ba010d8bed80 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "refresh_cache-1b617117-9d85-42be-bfd4-9f5228156160" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:36:38 compute-0 nova_compute[183278]: 2026-01-21 18:36:38.035 183284 DEBUG oslo_concurrency.lockutils [req-9ae2b3e9-93a0-49c5-a24e-fbe21b29627f req-fde06f99-8e90-4ae3-8c46-ba010d8bed80 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquired lock "refresh_cache-1b617117-9d85-42be-bfd4-9f5228156160" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 18:36:38 compute-0 nova_compute[183278]: 2026-01-21 18:36:38.035 183284 DEBUG nova.network.neutron [req-9ae2b3e9-93a0-49c5-a24e-fbe21b29627f req-fde06f99-8e90-4ae3-8c46-ba010d8bed80 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Refreshing network info cache for port a275d014-1cf3-4d9f-8e50-17958d3ca2a6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 18:36:38 compute-0 nova_compute[183278]: 2026-01-21 18:36:38.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:36:38 compute-0 nova_compute[183278]: 2026-01-21 18:36:38.817 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 18:36:39 compute-0 nova_compute[183278]: 2026-01-21 18:36:39.708 183284 DEBUG nova.virt.driver [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Emitting event <LifecycleEvent: 1769020599.7080903, 1b617117-9d85-42be-bfd4-9f5228156160 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:36:39 compute-0 nova_compute[183278]: 2026-01-21 18:36:39.709 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] VM Paused (Lifecycle Event)
Jan 21 18:36:39 compute-0 nova_compute[183278]: 2026-01-21 18:36:39.710 183284 DEBUG nova.virt.libvirt.migration [None req-54e8f3a4-ba4f-4056-b138-d2a72deefcde 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 21 18:36:39 compute-0 nova_compute[183278]: 2026-01-21 18:36:39.711 183284 DEBUG nova.virt.libvirt.migration [None req-54e8f3a4-ba4f-4056-b138-d2a72deefcde 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Jan 21 18:36:39 compute-0 nova_compute[183278]: 2026-01-21 18:36:39.735 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:36:39 compute-0 nova_compute[183278]: 2026-01-21 18:36:39.741 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 18:36:39 compute-0 nova_compute[183278]: 2026-01-21 18:36:39.765 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] During sync_power_state the instance has a pending task (migrating). Skip.
Jan 21 18:36:39 compute-0 kernel: tapa275d014-1c (unregistering): left promiscuous mode
Jan 21 18:36:39 compute-0 NetworkManager[55506]: <info>  [1769020599.9437] device (tapa275d014-1c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 18:36:39 compute-0 nova_compute[183278]: 2026-01-21 18:36:39.948 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:36:39 compute-0 ovn_controller[95419]: 2026-01-21T18:36:39Z|00172|binding|INFO|Releasing lport a275d014-1cf3-4d9f-8e50-17958d3ca2a6 from this chassis (sb_readonly=0)
Jan 21 18:36:39 compute-0 ovn_controller[95419]: 2026-01-21T18:36:39Z|00173|binding|INFO|Setting lport a275d014-1cf3-4d9f-8e50-17958d3ca2a6 down in Southbound
Jan 21 18:36:39 compute-0 ovn_controller[95419]: 2026-01-21T18:36:39Z|00174|binding|INFO|Removing iface tapa275d014-1c ovn-installed in OVS
Jan 21 18:36:39 compute-0 nova_compute[183278]: 2026-01-21 18:36:39.950 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:36:39 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:36:39.956 104698 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c8:80:d8 10.100.0.13'], port_security=['fa:16:3e:c8:80:d8 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '88a62794-b4a4-47e3-9cce-91e574e684c1'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '1b617117-9d85-42be-bfd4-9f5228156160', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-405ec01b-76d3-4c3c-a31b-5f16d9641fbf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe688847145f4dee992c72dd40bbc1ac', 'neutron:revision_number': '8', 'neutron:security_group_ids': '772bc98e-4f63-476c-ac15-1e185ee339f7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=64b67886-c0d9-40d2-a2d0-cf96a9cd3c14, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>], logical_port=a275d014-1cf3-4d9f-8e50-17958d3ca2a6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 18:36:39 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:36:39.958 104698 INFO neutron.agent.ovn.metadata.agent [-] Port a275d014-1cf3-4d9f-8e50-17958d3ca2a6 in datapath 405ec01b-76d3-4c3c-a31b-5f16d9641fbf unbound from our chassis
Jan 21 18:36:39 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:36:39.960 104698 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 405ec01b-76d3-4c3c-a31b-5f16d9641fbf, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 21 18:36:39 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:36:39.978 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[95e053b7-8be5-4113-b741-37f83233230d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:36:39 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:36:39.979 104698 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf namespace which is not needed anymore
Jan 21 18:36:39 compute-0 nova_compute[183278]: 2026-01-21 18:36:39.983 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:36:40 compute-0 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000017.scope: Deactivated successfully.
Jan 21 18:36:40 compute-0 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000017.scope: Consumed 15.355s CPU time.
Jan 21 18:36:40 compute-0 systemd-machined[154592]: Machine qemu-16-instance-00000017 terminated.
Jan 21 18:36:40 compute-0 neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf[211394]: [NOTICE]   (211398) : haproxy version is 2.8.14-c23fe91
Jan 21 18:36:40 compute-0 neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf[211394]: [NOTICE]   (211398) : path to executable is /usr/sbin/haproxy
Jan 21 18:36:40 compute-0 neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf[211394]: [WARNING]  (211398) : Exiting Master process...
Jan 21 18:36:40 compute-0 neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf[211394]: [ALERT]    (211398) : Current worker (211400) exited with code 143 (Terminated)
Jan 21 18:36:40 compute-0 neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf[211394]: [WARNING]  (211398) : All workers exited. Exiting... (0)
Jan 21 18:36:40 compute-0 systemd[1]: libpod-ffe31573eadd3d38ab43b2009064f6f951e44a46829eb91842dd642b06b5e1d8.scope: Deactivated successfully.
Jan 21 18:36:40 compute-0 podman[211683]: 2026-01-21 18:36:40.114772013 +0000 UTC m=+0.044787063 container died ffe31573eadd3d38ab43b2009064f6f951e44a46829eb91842dd642b06b5e1d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 21 18:36:40 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ffe31573eadd3d38ab43b2009064f6f951e44a46829eb91842dd642b06b5e1d8-userdata-shm.mount: Deactivated successfully.
Jan 21 18:36:40 compute-0 nova_compute[183278]: 2026-01-21 18:36:40.146 183284 DEBUG nova.compute.manager [req-1d1de3eb-b304-4de6-9d95-b66adcb9bc20 req-ad1a2cbf-0aa5-4d90-aca6-756355c76095 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Received event network-vif-unplugged-a275d014-1cf3-4d9f-8e50-17958d3ca2a6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:36:40 compute-0 nova_compute[183278]: 2026-01-21 18:36:40.147 183284 DEBUG oslo_concurrency.lockutils [req-1d1de3eb-b304-4de6-9d95-b66adcb9bc20 req-ad1a2cbf-0aa5-4d90-aca6-756355c76095 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "1b617117-9d85-42be-bfd4-9f5228156160-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:36:40 compute-0 nova_compute[183278]: 2026-01-21 18:36:40.148 183284 DEBUG oslo_concurrency.lockutils [req-1d1de3eb-b304-4de6-9d95-b66adcb9bc20 req-ad1a2cbf-0aa5-4d90-aca6-756355c76095 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "1b617117-9d85-42be-bfd4-9f5228156160-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:36:40 compute-0 systemd[1]: var-lib-containers-storage-overlay-0b091a1e4556dbbdaa466ff0ada2c40d1859e3334f154182d4043b28a48075de-merged.mount: Deactivated successfully.
Jan 21 18:36:40 compute-0 nova_compute[183278]: 2026-01-21 18:36:40.148 183284 DEBUG oslo_concurrency.lockutils [req-1d1de3eb-b304-4de6-9d95-b66adcb9bc20 req-ad1a2cbf-0aa5-4d90-aca6-756355c76095 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "1b617117-9d85-42be-bfd4-9f5228156160-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:36:40 compute-0 nova_compute[183278]: 2026-01-21 18:36:40.149 183284 DEBUG nova.compute.manager [req-1d1de3eb-b304-4de6-9d95-b66adcb9bc20 req-ad1a2cbf-0aa5-4d90-aca6-756355c76095 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] No waiting events found dispatching network-vif-unplugged-a275d014-1cf3-4d9f-8e50-17958d3ca2a6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:36:40 compute-0 nova_compute[183278]: 2026-01-21 18:36:40.149 183284 DEBUG nova.compute.manager [req-1d1de3eb-b304-4de6-9d95-b66adcb9bc20 req-ad1a2cbf-0aa5-4d90-aca6-756355c76095 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Received event network-vif-unplugged-a275d014-1cf3-4d9f-8e50-17958d3ca2a6 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 21 18:36:40 compute-0 podman[211683]: 2026-01-21 18:36:40.159848412 +0000 UTC m=+0.089863462 container cleanup ffe31573eadd3d38ab43b2009064f6f951e44a46829eb91842dd642b06b5e1d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 21 18:36:40 compute-0 systemd[1]: libpod-conmon-ffe31573eadd3d38ab43b2009064f6f951e44a46829eb91842dd642b06b5e1d8.scope: Deactivated successfully.
Jan 21 18:36:40 compute-0 nova_compute[183278]: 2026-01-21 18:36:40.175 183284 DEBUG nova.virt.libvirt.driver [None req-54e8f3a4-ba4f-4056-b138-d2a72deefcde 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Jan 21 18:36:40 compute-0 nova_compute[183278]: 2026-01-21 18:36:40.176 183284 DEBUG nova.virt.libvirt.driver [None req-54e8f3a4-ba4f-4056-b138-d2a72deefcde 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Jan 21 18:36:40 compute-0 nova_compute[183278]: 2026-01-21 18:36:40.176 183284 DEBUG nova.virt.libvirt.driver [None req-54e8f3a4-ba4f-4056-b138-d2a72deefcde 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Jan 21 18:36:40 compute-0 nova_compute[183278]: 2026-01-21 18:36:40.212 183284 DEBUG nova.virt.libvirt.guest [None req-54e8f3a4-ba4f-4056-b138-d2a72deefcde 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid '1b617117-9d85-42be-bfd4-9f5228156160' (instance-00000017) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Jan 21 18:36:40 compute-0 nova_compute[183278]: 2026-01-21 18:36:40.213 183284 INFO nova.virt.libvirt.driver [None req-54e8f3a4-ba4f-4056-b138-d2a72deefcde 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Migration operation has completed
Jan 21 18:36:40 compute-0 nova_compute[183278]: 2026-01-21 18:36:40.213 183284 INFO nova.compute.manager [None req-54e8f3a4-ba4f-4056-b138-d2a72deefcde 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] _post_live_migration() is started..
Jan 21 18:36:40 compute-0 podman[211728]: 2026-01-21 18:36:40.222552867 +0000 UTC m=+0.042372015 container remove ffe31573eadd3d38ab43b2009064f6f951e44a46829eb91842dd642b06b5e1d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 21 18:36:40 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:36:40.227 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[5d6fa539-5f90-4343-9cd7-e8aa4d8232a3]: (4, ('Wed Jan 21 06:36:40 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf (ffe31573eadd3d38ab43b2009064f6f951e44a46829eb91842dd642b06b5e1d8)\nffe31573eadd3d38ab43b2009064f6f951e44a46829eb91842dd642b06b5e1d8\nWed Jan 21 06:36:40 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf (ffe31573eadd3d38ab43b2009064f6f951e44a46829eb91842dd642b06b5e1d8)\nffe31573eadd3d38ab43b2009064f6f951e44a46829eb91842dd642b06b5e1d8\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:36:40 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:36:40.229 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[2bd9af14-8117-4d96-b3ef-028d1285df54]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:36:40 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:36:40.230 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap405ec01b-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:36:40 compute-0 nova_compute[183278]: 2026-01-21 18:36:40.231 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:36:40 compute-0 kernel: tap405ec01b-70: left promiscuous mode
Jan 21 18:36:40 compute-0 nova_compute[183278]: 2026-01-21 18:36:40.243 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:36:40 compute-0 nova_compute[183278]: 2026-01-21 18:36:40.247 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:36:40 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:36:40.250 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[981b424e-63d0-4142-82aa-03e27b54361f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:36:40 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:36:40.267 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[da810266-ab27-403e-9126-b2d9582392df]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:36:40 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:36:40.268 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[297f5b3d-52ec-423d-913a-261e5862f9d2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:36:40 compute-0 nova_compute[183278]: 2026-01-21 18:36:40.270 183284 DEBUG nova.network.neutron [req-9ae2b3e9-93a0-49c5-a24e-fbe21b29627f req-fde06f99-8e90-4ae3-8c46-ba010d8bed80 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Updated VIF entry in instance network info cache for port a275d014-1cf3-4d9f-8e50-17958d3ca2a6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 18:36:40 compute-0 nova_compute[183278]: 2026-01-21 18:36:40.270 183284 DEBUG nova.network.neutron [req-9ae2b3e9-93a0-49c5-a24e-fbe21b29627f req-fde06f99-8e90-4ae3-8c46-ba010d8bed80 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Updating instance_info_cache with network_info: [{"id": "a275d014-1cf3-4d9f-8e50-17958d3ca2a6", "address": "fa:16:3e:c8:80:d8", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa275d014-1c", "ovs_interfaceid": "a275d014-1cf3-4d9f-8e50-17958d3ca2a6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:36:40 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:36:40.283 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[d1d54587-9799-432f-ac90-aab3ca85fb31]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 509674, 'reachable_time': 41119, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211748, 'error': None, 'target': 'ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:36:40 compute-0 systemd[1]: run-netns-ovnmeta\x2d405ec01b\x2d76d3\x2d4c3c\x2da31b\x2d5f16d9641fbf.mount: Deactivated successfully.
Jan 21 18:36:40 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:36:40.287 105036 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-405ec01b-76d3-4c3c-a31b-5f16d9641fbf deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 21 18:36:40 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:36:40.287 105036 DEBUG oslo.privsep.daemon [-] privsep: reply[173ba08e-c0ff-4bd8-91ad-8e7e78a0f863]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:36:40 compute-0 nova_compute[183278]: 2026-01-21 18:36:40.289 183284 DEBUG oslo_concurrency.lockutils [req-9ae2b3e9-93a0-49c5-a24e-fbe21b29627f req-fde06f99-8e90-4ae3-8c46-ba010d8bed80 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Releasing lock "refresh_cache-1b617117-9d85-42be-bfd4-9f5228156160" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 18:36:41 compute-0 nova_compute[183278]: 2026-01-21 18:36:41.134 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:36:41 compute-0 nova_compute[183278]: 2026-01-21 18:36:41.165 183284 DEBUG nova.compute.manager [req-29eaebab-e1ad-4990-ad64-03bc4d4f70a7 req-a3ce6654-183f-4337-8125-5dd4e585407c 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Received event network-vif-unplugged-a275d014-1cf3-4d9f-8e50-17958d3ca2a6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:36:41 compute-0 nova_compute[183278]: 2026-01-21 18:36:41.166 183284 DEBUG oslo_concurrency.lockutils [req-29eaebab-e1ad-4990-ad64-03bc4d4f70a7 req-a3ce6654-183f-4337-8125-5dd4e585407c 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "1b617117-9d85-42be-bfd4-9f5228156160-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:36:41 compute-0 nova_compute[183278]: 2026-01-21 18:36:41.167 183284 DEBUG oslo_concurrency.lockutils [req-29eaebab-e1ad-4990-ad64-03bc4d4f70a7 req-a3ce6654-183f-4337-8125-5dd4e585407c 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "1b617117-9d85-42be-bfd4-9f5228156160-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:36:41 compute-0 nova_compute[183278]: 2026-01-21 18:36:41.168 183284 DEBUG oslo_concurrency.lockutils [req-29eaebab-e1ad-4990-ad64-03bc4d4f70a7 req-a3ce6654-183f-4337-8125-5dd4e585407c 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "1b617117-9d85-42be-bfd4-9f5228156160-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:36:41 compute-0 nova_compute[183278]: 2026-01-21 18:36:41.168 183284 DEBUG nova.compute.manager [req-29eaebab-e1ad-4990-ad64-03bc4d4f70a7 req-a3ce6654-183f-4337-8125-5dd4e585407c 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] No waiting events found dispatching network-vif-unplugged-a275d014-1cf3-4d9f-8e50-17958d3ca2a6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:36:41 compute-0 nova_compute[183278]: 2026-01-21 18:36:41.169 183284 DEBUG nova.compute.manager [req-29eaebab-e1ad-4990-ad64-03bc4d4f70a7 req-a3ce6654-183f-4337-8125-5dd4e585407c 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Received event network-vif-unplugged-a275d014-1cf3-4d9f-8e50-17958d3ca2a6 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 21 18:36:41 compute-0 nova_compute[183278]: 2026-01-21 18:36:41.256 183284 DEBUG nova.network.neutron [None req-54e8f3a4-ba4f-4056-b138-d2a72deefcde 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Activated binding for port a275d014-1cf3-4d9f-8e50-17958d3ca2a6 and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Jan 21 18:36:41 compute-0 nova_compute[183278]: 2026-01-21 18:36:41.257 183284 DEBUG nova.compute.manager [None req-54e8f3a4-ba4f-4056-b138-d2a72deefcde 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "a275d014-1cf3-4d9f-8e50-17958d3ca2a6", "address": "fa:16:3e:c8:80:d8", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa275d014-1c", "ovs_interfaceid": "a275d014-1cf3-4d9f-8e50-17958d3ca2a6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Jan 21 18:36:41 compute-0 nova_compute[183278]: 2026-01-21 18:36:41.258 183284 DEBUG nova.virt.libvirt.vif [None req-54e8f3a4-ba4f-4056-b138-d2a72deefcde 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T18:35:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-656242378',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-656242378',id=23,image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T18:35:33Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='fe688847145f4dee992c72dd40bbc1ac',ramdisk_id='',reservation_id='r-0prerflf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',ow
ner_project_name='tempest-TestExecuteStrategies-1753607426',owner_user_name='tempest-TestExecuteStrategies-1753607426-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T18:36:24Z,user_data=None,user_id='41dc6e790bc54fbfaf5c6007d3fa5f63',uuid=1b617117-9d85-42be-bfd4-9f5228156160,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a275d014-1cf3-4d9f-8e50-17958d3ca2a6", "address": "fa:16:3e:c8:80:d8", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa275d014-1c", "ovs_interfaceid": "a275d014-1cf3-4d9f-8e50-17958d3ca2a6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 18:36:41 compute-0 nova_compute[183278]: 2026-01-21 18:36:41.258 183284 DEBUG nova.network.os_vif_util [None req-54e8f3a4-ba4f-4056-b138-d2a72deefcde 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Converting VIF {"id": "a275d014-1cf3-4d9f-8e50-17958d3ca2a6", "address": "fa:16:3e:c8:80:d8", "network": {"id": "405ec01b-76d3-4c3c-a31b-5f16d9641fbf", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-449063233-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fe688847145f4dee992c72dd40bbc1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa275d014-1c", "ovs_interfaceid": "a275d014-1cf3-4d9f-8e50-17958d3ca2a6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 18:36:41 compute-0 nova_compute[183278]: 2026-01-21 18:36:41.259 183284 DEBUG nova.network.os_vif_util [None req-54e8f3a4-ba4f-4056-b138-d2a72deefcde 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c8:80:d8,bridge_name='br-int',has_traffic_filtering=True,id=a275d014-1cf3-4d9f-8e50-17958d3ca2a6,network=Network(405ec01b-76d3-4c3c-a31b-5f16d9641fbf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa275d014-1c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 18:36:41 compute-0 nova_compute[183278]: 2026-01-21 18:36:41.259 183284 DEBUG os_vif [None req-54e8f3a4-ba4f-4056-b138-d2a72deefcde 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c8:80:d8,bridge_name='br-int',has_traffic_filtering=True,id=a275d014-1cf3-4d9f-8e50-17958d3ca2a6,network=Network(405ec01b-76d3-4c3c-a31b-5f16d9641fbf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa275d014-1c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 21 18:36:41 compute-0 nova_compute[183278]: 2026-01-21 18:36:41.261 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:36:41 compute-0 nova_compute[183278]: 2026-01-21 18:36:41.262 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa275d014-1c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:36:41 compute-0 nova_compute[183278]: 2026-01-21 18:36:41.263 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:36:41 compute-0 nova_compute[183278]: 2026-01-21 18:36:41.265 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:36:41 compute-0 nova_compute[183278]: 2026-01-21 18:36:41.267 183284 INFO os_vif [None req-54e8f3a4-ba4f-4056-b138-d2a72deefcde 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c8:80:d8,bridge_name='br-int',has_traffic_filtering=True,id=a275d014-1cf3-4d9f-8e50-17958d3ca2a6,network=Network(405ec01b-76d3-4c3c-a31b-5f16d9641fbf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa275d014-1c')
Jan 21 18:36:41 compute-0 nova_compute[183278]: 2026-01-21 18:36:41.268 183284 DEBUG oslo_concurrency.lockutils [None req-54e8f3a4-ba4f-4056-b138-d2a72deefcde 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:36:41 compute-0 nova_compute[183278]: 2026-01-21 18:36:41.268 183284 DEBUG oslo_concurrency.lockutils [None req-54e8f3a4-ba4f-4056-b138-d2a72deefcde 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:36:41 compute-0 nova_compute[183278]: 2026-01-21 18:36:41.269 183284 DEBUG oslo_concurrency.lockutils [None req-54e8f3a4-ba4f-4056-b138-d2a72deefcde 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:36:41 compute-0 nova_compute[183278]: 2026-01-21 18:36:41.269 183284 DEBUG nova.compute.manager [None req-54e8f3a4-ba4f-4056-b138-d2a72deefcde 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Jan 21 18:36:41 compute-0 nova_compute[183278]: 2026-01-21 18:36:41.269 183284 INFO nova.virt.libvirt.driver [None req-54e8f3a4-ba4f-4056-b138-d2a72deefcde 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Deleting instance files /var/lib/nova/instances/1b617117-9d85-42be-bfd4-9f5228156160_del
Jan 21 18:36:41 compute-0 nova_compute[183278]: 2026-01-21 18:36:41.270 183284 INFO nova.virt.libvirt.driver [None req-54e8f3a4-ba4f-4056-b138-d2a72deefcde 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Deletion of /var/lib/nova/instances/1b617117-9d85-42be-bfd4-9f5228156160_del complete
Jan 21 18:36:41 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Jan 21 18:36:41 compute-0 systemd[211631]: Activating special unit Exit the Session...
Jan 21 18:36:41 compute-0 systemd[211631]: Stopped target Main User Target.
Jan 21 18:36:41 compute-0 systemd[211631]: Stopped target Basic System.
Jan 21 18:36:41 compute-0 systemd[211631]: Stopped target Paths.
Jan 21 18:36:41 compute-0 systemd[211631]: Stopped target Sockets.
Jan 21 18:36:41 compute-0 systemd[211631]: Stopped target Timers.
Jan 21 18:36:41 compute-0 systemd[211631]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 21 18:36:41 compute-0 systemd[211631]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 21 18:36:41 compute-0 systemd[211631]: Closed D-Bus User Message Bus Socket.
Jan 21 18:36:41 compute-0 systemd[211631]: Stopped Create User's Volatile Files and Directories.
Jan 21 18:36:41 compute-0 systemd[211631]: Removed slice User Application Slice.
Jan 21 18:36:41 compute-0 systemd[211631]: Reached target Shutdown.
Jan 21 18:36:41 compute-0 systemd[211631]: Finished Exit the Session.
Jan 21 18:36:41 compute-0 systemd[211631]: Reached target Exit the Session.
Jan 21 18:36:41 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Jan 21 18:36:41 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Jan 21 18:36:41 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 21 18:36:41 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 21 18:36:41 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 21 18:36:41 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 21 18:36:41 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Jan 21 18:36:42 compute-0 nova_compute[183278]: 2026-01-21 18:36:42.228 183284 DEBUG nova.compute.manager [req-cfe192a7-7bed-48ff-9720-65459fee6be4 req-6e3a4070-5aa2-415b-a53c-9f45407c2f3c 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Received event network-vif-plugged-a275d014-1cf3-4d9f-8e50-17958d3ca2a6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:36:42 compute-0 nova_compute[183278]: 2026-01-21 18:36:42.229 183284 DEBUG oslo_concurrency.lockutils [req-cfe192a7-7bed-48ff-9720-65459fee6be4 req-6e3a4070-5aa2-415b-a53c-9f45407c2f3c 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "1b617117-9d85-42be-bfd4-9f5228156160-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:36:42 compute-0 nova_compute[183278]: 2026-01-21 18:36:42.230 183284 DEBUG oslo_concurrency.lockutils [req-cfe192a7-7bed-48ff-9720-65459fee6be4 req-6e3a4070-5aa2-415b-a53c-9f45407c2f3c 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "1b617117-9d85-42be-bfd4-9f5228156160-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:36:42 compute-0 nova_compute[183278]: 2026-01-21 18:36:42.230 183284 DEBUG oslo_concurrency.lockutils [req-cfe192a7-7bed-48ff-9720-65459fee6be4 req-6e3a4070-5aa2-415b-a53c-9f45407c2f3c 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "1b617117-9d85-42be-bfd4-9f5228156160-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:36:42 compute-0 nova_compute[183278]: 2026-01-21 18:36:42.231 183284 DEBUG nova.compute.manager [req-cfe192a7-7bed-48ff-9720-65459fee6be4 req-6e3a4070-5aa2-415b-a53c-9f45407c2f3c 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] No waiting events found dispatching network-vif-plugged-a275d014-1cf3-4d9f-8e50-17958d3ca2a6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:36:42 compute-0 nova_compute[183278]: 2026-01-21 18:36:42.231 183284 WARNING nova.compute.manager [req-cfe192a7-7bed-48ff-9720-65459fee6be4 req-6e3a4070-5aa2-415b-a53c-9f45407c2f3c 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Received unexpected event network-vif-plugged-a275d014-1cf3-4d9f-8e50-17958d3ca2a6 for instance with vm_state active and task_state migrating.
Jan 21 18:36:42 compute-0 nova_compute[183278]: 2026-01-21 18:36:42.232 183284 DEBUG nova.compute.manager [req-cfe192a7-7bed-48ff-9720-65459fee6be4 req-6e3a4070-5aa2-415b-a53c-9f45407c2f3c 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Received event network-vif-plugged-a275d014-1cf3-4d9f-8e50-17958d3ca2a6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:36:42 compute-0 nova_compute[183278]: 2026-01-21 18:36:42.232 183284 DEBUG oslo_concurrency.lockutils [req-cfe192a7-7bed-48ff-9720-65459fee6be4 req-6e3a4070-5aa2-415b-a53c-9f45407c2f3c 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "1b617117-9d85-42be-bfd4-9f5228156160-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:36:42 compute-0 nova_compute[183278]: 2026-01-21 18:36:42.233 183284 DEBUG oslo_concurrency.lockutils [req-cfe192a7-7bed-48ff-9720-65459fee6be4 req-6e3a4070-5aa2-415b-a53c-9f45407c2f3c 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "1b617117-9d85-42be-bfd4-9f5228156160-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:36:42 compute-0 nova_compute[183278]: 2026-01-21 18:36:42.233 183284 DEBUG oslo_concurrency.lockutils [req-cfe192a7-7bed-48ff-9720-65459fee6be4 req-6e3a4070-5aa2-415b-a53c-9f45407c2f3c 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "1b617117-9d85-42be-bfd4-9f5228156160-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:36:42 compute-0 nova_compute[183278]: 2026-01-21 18:36:42.234 183284 DEBUG nova.compute.manager [req-cfe192a7-7bed-48ff-9720-65459fee6be4 req-6e3a4070-5aa2-415b-a53c-9f45407c2f3c 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] No waiting events found dispatching network-vif-plugged-a275d014-1cf3-4d9f-8e50-17958d3ca2a6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:36:42 compute-0 nova_compute[183278]: 2026-01-21 18:36:42.235 183284 WARNING nova.compute.manager [req-cfe192a7-7bed-48ff-9720-65459fee6be4 req-6e3a4070-5aa2-415b-a53c-9f45407c2f3c 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Received unexpected event network-vif-plugged-a275d014-1cf3-4d9f-8e50-17958d3ca2a6 for instance with vm_state active and task_state migrating.
Jan 21 18:36:42 compute-0 nova_compute[183278]: 2026-01-21 18:36:42.235 183284 DEBUG nova.compute.manager [req-cfe192a7-7bed-48ff-9720-65459fee6be4 req-6e3a4070-5aa2-415b-a53c-9f45407c2f3c 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Received event network-vif-plugged-a275d014-1cf3-4d9f-8e50-17958d3ca2a6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:36:42 compute-0 nova_compute[183278]: 2026-01-21 18:36:42.236 183284 DEBUG oslo_concurrency.lockutils [req-cfe192a7-7bed-48ff-9720-65459fee6be4 req-6e3a4070-5aa2-415b-a53c-9f45407c2f3c 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "1b617117-9d85-42be-bfd4-9f5228156160-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:36:42 compute-0 nova_compute[183278]: 2026-01-21 18:36:42.236 183284 DEBUG oslo_concurrency.lockutils [req-cfe192a7-7bed-48ff-9720-65459fee6be4 req-6e3a4070-5aa2-415b-a53c-9f45407c2f3c 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "1b617117-9d85-42be-bfd4-9f5228156160-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:36:42 compute-0 nova_compute[183278]: 2026-01-21 18:36:42.237 183284 DEBUG oslo_concurrency.lockutils [req-cfe192a7-7bed-48ff-9720-65459fee6be4 req-6e3a4070-5aa2-415b-a53c-9f45407c2f3c 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "1b617117-9d85-42be-bfd4-9f5228156160-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:36:42 compute-0 nova_compute[183278]: 2026-01-21 18:36:42.238 183284 DEBUG nova.compute.manager [req-cfe192a7-7bed-48ff-9720-65459fee6be4 req-6e3a4070-5aa2-415b-a53c-9f45407c2f3c 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] No waiting events found dispatching network-vif-plugged-a275d014-1cf3-4d9f-8e50-17958d3ca2a6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:36:42 compute-0 nova_compute[183278]: 2026-01-21 18:36:42.238 183284 WARNING nova.compute.manager [req-cfe192a7-7bed-48ff-9720-65459fee6be4 req-6e3a4070-5aa2-415b-a53c-9f45407c2f3c 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Received unexpected event network-vif-plugged-a275d014-1cf3-4d9f-8e50-17958d3ca2a6 for instance with vm_state active and task_state migrating.
Jan 21 18:36:42 compute-0 nova_compute[183278]: 2026-01-21 18:36:42.239 183284 DEBUG nova.compute.manager [req-cfe192a7-7bed-48ff-9720-65459fee6be4 req-6e3a4070-5aa2-415b-a53c-9f45407c2f3c 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Received event network-vif-plugged-a275d014-1cf3-4d9f-8e50-17958d3ca2a6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:36:42 compute-0 nova_compute[183278]: 2026-01-21 18:36:42.239 183284 DEBUG oslo_concurrency.lockutils [req-cfe192a7-7bed-48ff-9720-65459fee6be4 req-6e3a4070-5aa2-415b-a53c-9f45407c2f3c 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "1b617117-9d85-42be-bfd4-9f5228156160-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:36:42 compute-0 nova_compute[183278]: 2026-01-21 18:36:42.240 183284 DEBUG oslo_concurrency.lockutils [req-cfe192a7-7bed-48ff-9720-65459fee6be4 req-6e3a4070-5aa2-415b-a53c-9f45407c2f3c 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "1b617117-9d85-42be-bfd4-9f5228156160-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:36:42 compute-0 nova_compute[183278]: 2026-01-21 18:36:42.241 183284 DEBUG oslo_concurrency.lockutils [req-cfe192a7-7bed-48ff-9720-65459fee6be4 req-6e3a4070-5aa2-415b-a53c-9f45407c2f3c 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "1b617117-9d85-42be-bfd4-9f5228156160-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:36:42 compute-0 nova_compute[183278]: 2026-01-21 18:36:42.241 183284 DEBUG nova.compute.manager [req-cfe192a7-7bed-48ff-9720-65459fee6be4 req-6e3a4070-5aa2-415b-a53c-9f45407c2f3c 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] No waiting events found dispatching network-vif-plugged-a275d014-1cf3-4d9f-8e50-17958d3ca2a6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:36:42 compute-0 nova_compute[183278]: 2026-01-21 18:36:42.242 183284 WARNING nova.compute.manager [req-cfe192a7-7bed-48ff-9720-65459fee6be4 req-6e3a4070-5aa2-415b-a53c-9f45407c2f3c 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Received unexpected event network-vif-plugged-a275d014-1cf3-4d9f-8e50-17958d3ca2a6 for instance with vm_state active and task_state migrating.
Jan 21 18:36:45 compute-0 podman[211750]: 2026-01-21 18:36:45.01143345 +0000 UTC m=+0.065510313 container health_status 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.component=ubi9-minimal-container, distribution-scope=public, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, config_id=openstack_network_exporter, 
summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41)
Jan 21 18:36:46 compute-0 nova_compute[183278]: 2026-01-21 18:36:46.058 183284 DEBUG oslo_concurrency.lockutils [None req-54e8f3a4-ba4f-4056-b138-d2a72deefcde 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "1b617117-9d85-42be-bfd4-9f5228156160-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:36:46 compute-0 nova_compute[183278]: 2026-01-21 18:36:46.058 183284 DEBUG oslo_concurrency.lockutils [None req-54e8f3a4-ba4f-4056-b138-d2a72deefcde 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lock "1b617117-9d85-42be-bfd4-9f5228156160-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:36:46 compute-0 nova_compute[183278]: 2026-01-21 18:36:46.059 183284 DEBUG oslo_concurrency.lockutils [None req-54e8f3a4-ba4f-4056-b138-d2a72deefcde 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lock "1b617117-9d85-42be-bfd4-9f5228156160-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:36:46 compute-0 nova_compute[183278]: 2026-01-21 18:36:46.083 183284 DEBUG oslo_concurrency.lockutils [None req-54e8f3a4-ba4f-4056-b138-d2a72deefcde 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:36:46 compute-0 nova_compute[183278]: 2026-01-21 18:36:46.083 183284 DEBUG oslo_concurrency.lockutils [None req-54e8f3a4-ba4f-4056-b138-d2a72deefcde 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:36:46 compute-0 nova_compute[183278]: 2026-01-21 18:36:46.083 183284 DEBUG oslo_concurrency.lockutils [None req-54e8f3a4-ba4f-4056-b138-d2a72deefcde 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:36:46 compute-0 nova_compute[183278]: 2026-01-21 18:36:46.083 183284 DEBUG nova.compute.resource_tracker [None req-54e8f3a4-ba4f-4056-b138-d2a72deefcde 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 18:36:46 compute-0 nova_compute[183278]: 2026-01-21 18:36:46.138 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:36:46 compute-0 nova_compute[183278]: 2026-01-21 18:36:46.227 183284 WARNING nova.virt.libvirt.driver [None req-54e8f3a4-ba4f-4056-b138-d2a72deefcde 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 18:36:46 compute-0 nova_compute[183278]: 2026-01-21 18:36:46.228 183284 DEBUG nova.compute.resource_tracker [None req-54e8f3a4-ba4f-4056-b138-d2a72deefcde 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5818MB free_disk=73.3789176940918GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": 
"0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 18:36:46 compute-0 nova_compute[183278]: 2026-01-21 18:36:46.229 183284 DEBUG oslo_concurrency.lockutils [None req-54e8f3a4-ba4f-4056-b138-d2a72deefcde 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:36:46 compute-0 nova_compute[183278]: 2026-01-21 18:36:46.229 183284 DEBUG oslo_concurrency.lockutils [None req-54e8f3a4-ba4f-4056-b138-d2a72deefcde 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:36:46 compute-0 nova_compute[183278]: 2026-01-21 18:36:46.263 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:36:46 compute-0 nova_compute[183278]: 2026-01-21 18:36:46.271 183284 DEBUG nova.compute.resource_tracker [None req-54e8f3a4-ba4f-4056-b138-d2a72deefcde 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Migration for instance 1b617117-9d85-42be-bfd4-9f5228156160 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Jan 21 18:36:46 compute-0 nova_compute[183278]: 2026-01-21 18:36:46.292 183284 DEBUG nova.compute.resource_tracker [None req-54e8f3a4-ba4f-4056-b138-d2a72deefcde 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Jan 21 18:36:46 compute-0 nova_compute[183278]: 2026-01-21 18:36:46.329 183284 DEBUG nova.compute.resource_tracker [None req-54e8f3a4-ba4f-4056-b138-d2a72deefcde 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Migration eaef418c-4b38-4e87-83d9-1ff31ac41938 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Jan 21 18:36:46 compute-0 nova_compute[183278]: 2026-01-21 18:36:46.329 183284 DEBUG nova.compute.resource_tracker [None req-54e8f3a4-ba4f-4056-b138-d2a72deefcde 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 18:36:46 compute-0 nova_compute[183278]: 2026-01-21 18:36:46.329 183284 DEBUG nova.compute.resource_tracker [None req-54e8f3a4-ba4f-4056-b138-d2a72deefcde 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 18:36:46 compute-0 nova_compute[183278]: 2026-01-21 18:36:46.371 183284 DEBUG nova.compute.provider_tree [None req-54e8f3a4-ba4f-4056-b138-d2a72deefcde 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Inventory has not changed in ProviderTree for provider: 502e4243-611b-433d-a766-9b485d51652d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 18:36:46 compute-0 nova_compute[183278]: 2026-01-21 18:36:46.411 183284 DEBUG nova.scheduler.client.report [None req-54e8f3a4-ba4f-4056-b138-d2a72deefcde 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Inventory has not changed for provider 502e4243-611b-433d-a766-9b485d51652d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 18:36:46 compute-0 nova_compute[183278]: 2026-01-21 18:36:46.638 183284 DEBUG nova.compute.resource_tracker [None req-54e8f3a4-ba4f-4056-b138-d2a72deefcde 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 18:36:46 compute-0 nova_compute[183278]: 2026-01-21 18:36:46.638 183284 DEBUG oslo_concurrency.lockutils [None req-54e8f3a4-ba4f-4056-b138-d2a72deefcde 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.409s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:36:46 compute-0 nova_compute[183278]: 2026-01-21 18:36:46.644 183284 INFO nova.compute.manager [None req-54e8f3a4-ba4f-4056-b138-d2a72deefcde 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Jan 21 18:36:46 compute-0 nova_compute[183278]: 2026-01-21 18:36:46.918 183284 INFO nova.scheduler.client.report [None req-54e8f3a4-ba4f-4056-b138-d2a72deefcde 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Deleted allocation for migration eaef418c-4b38-4e87-83d9-1ff31ac41938
Jan 21 18:36:46 compute-0 nova_compute[183278]: 2026-01-21 18:36:46.918 183284 DEBUG nova.virt.libvirt.driver [None req-54e8f3a4-ba4f-4056-b138-d2a72deefcde 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Jan 21 18:36:51 compute-0 nova_compute[183278]: 2026-01-21 18:36:51.139 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:36:51 compute-0 nova_compute[183278]: 2026-01-21 18:36:51.264 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:36:52 compute-0 podman[211773]: 2026-01-21 18:36:52.028563686 +0000 UTC m=+0.079945242 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Jan 21 18:36:52 compute-0 podman[211772]: 2026-01-21 18:36:52.053492948 +0000 UTC m=+0.101575745 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 18:36:55 compute-0 nova_compute[183278]: 2026-01-21 18:36:55.174 183284 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769020600.1735656, 1b617117-9d85-42be-bfd4-9f5228156160 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:36:55 compute-0 nova_compute[183278]: 2026-01-21 18:36:55.175 183284 INFO nova.compute.manager [-] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] VM Stopped (Lifecycle Event)
Jan 21 18:36:55 compute-0 nova_compute[183278]: 2026-01-21 18:36:55.202 183284 DEBUG nova.compute.manager [None req-0901f104-f74c-42e3-a265-9c26f1c403b6 - - - - - -] [instance: 1b617117-9d85-42be-bfd4-9f5228156160] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:36:56 compute-0 nova_compute[183278]: 2026-01-21 18:36:56.142 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:36:56 compute-0 nova_compute[183278]: 2026-01-21 18:36:56.266 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:36:56 compute-0 podman[211818]: 2026-01-21 18:36:56.98933361 +0000 UTC m=+0.048582483 container health_status 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 21 18:36:59 compute-0 podman[192560]: time="2026-01-21T18:36:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 18:36:59 compute-0 podman[192560]: @ - - [21/Jan/2026:18:36:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15354 "" "Go-http-client/1.1"
Jan 21 18:36:59 compute-0 podman[192560]: @ - - [21/Jan/2026:18:36:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2181 "" "Go-http-client/1.1"
Jan 21 18:37:01 compute-0 nova_compute[183278]: 2026-01-21 18:37:01.143 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:37:01 compute-0 nova_compute[183278]: 2026-01-21 18:37:01.268 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:37:01 compute-0 openstack_network_exporter[195402]: ERROR   18:37:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 21 18:37:01 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:37:01 compute-0 openstack_network_exporter[195402]: ERROR   18:37:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 21 18:37:01 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:37:06 compute-0 nova_compute[183278]: 2026-01-21 18:37:06.146 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:37:06 compute-0 nova_compute[183278]: 2026-01-21 18:37:06.270 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:37:11 compute-0 nova_compute[183278]: 2026-01-21 18:37:11.146 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:37:11 compute-0 nova_compute[183278]: 2026-01-21 18:37:11.271 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:37:16 compute-0 podman[211842]: 2026-01-21 18:37:16.013418357 +0000 UTC m=+0.058690259 container health_status 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, container_name=openstack_network_exporter, io.openshift.expose-services=, maintainer=Red Hat, Inc., version=9.6, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, distribution-scope=public, architecture=x86_64, vcs-type=git, release=1755695350, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 21 18:37:16 compute-0 nova_compute[183278]: 2026-01-21 18:37:16.147 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:37:16 compute-0 nova_compute[183278]: 2026-01-21 18:37:16.273 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:37:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:37:20.106 104698 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:37:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:37:20.107 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:37:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:37:20.107 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:37:21 compute-0 nova_compute[183278]: 2026-01-21 18:37:21.148 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:37:21 compute-0 nova_compute[183278]: 2026-01-21 18:37:21.277 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:37:22 compute-0 podman[211866]: 2026-01-21 18:37:22.993455186 +0000 UTC m=+0.048166935 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 21 18:37:23 compute-0 podman[211865]: 2026-01-21 18:37:23.022378784 +0000 UTC m=+0.080238989 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 18:37:23 compute-0 ovn_controller[95419]: 2026-01-21T18:37:23Z|00175|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 21 18:37:24 compute-0 nova_compute[183278]: 2026-01-21 18:37:24.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:37:24 compute-0 nova_compute[183278]: 2026-01-21 18:37:24.818 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 18:37:24 compute-0 nova_compute[183278]: 2026-01-21 18:37:24.818 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 18:37:24 compute-0 nova_compute[183278]: 2026-01-21 18:37:24.840 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 21 18:37:26 compute-0 nova_compute[183278]: 2026-01-21 18:37:26.150 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:37:26 compute-0 nova_compute[183278]: 2026-01-21 18:37:26.279 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:37:27 compute-0 podman[211910]: 2026-01-21 18:37:27.996439081 +0000 UTC m=+0.055489452 container health_status 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 21 18:37:28 compute-0 nova_compute[183278]: 2026-01-21 18:37:28.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:37:28 compute-0 nova_compute[183278]: 2026-01-21 18:37:28.934 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:37:28 compute-0 nova_compute[183278]: 2026-01-21 18:37:28.934 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:37:28 compute-0 nova_compute[183278]: 2026-01-21 18:37:28.934 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:37:28 compute-0 nova_compute[183278]: 2026-01-21 18:37:28.935 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 18:37:29 compute-0 nova_compute[183278]: 2026-01-21 18:37:29.071 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:37:29 compute-0 nova_compute[183278]: 2026-01-21 18:37:29.119 183284 WARNING nova.virt.libvirt.driver [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 18:37:29 compute-0 nova_compute[183278]: 2026-01-21 18:37:29.120 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5827MB free_disk=73.3789176940918GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 18:37:29 compute-0 nova_compute[183278]: 2026-01-21 18:37:29.121 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:37:29 compute-0 nova_compute[183278]: 2026-01-21 18:37:29.121 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:37:29 compute-0 nova_compute[183278]: 2026-01-21 18:37:29.381 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 18:37:29 compute-0 nova_compute[183278]: 2026-01-21 18:37:29.382 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 18:37:29 compute-0 nova_compute[183278]: 2026-01-21 18:37:29.403 183284 DEBUG nova.compute.provider_tree [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Inventory has not changed in ProviderTree for provider: 502e4243-611b-433d-a766-9b485d51652d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 18:37:29 compute-0 nova_compute[183278]: 2026-01-21 18:37:29.424 183284 DEBUG nova.scheduler.client.report [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Inventory has not changed for provider 502e4243-611b-433d-a766-9b485d51652d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 18:37:29 compute-0 nova_compute[183278]: 2026-01-21 18:37:29.425 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 18:37:29 compute-0 nova_compute[183278]: 2026-01-21 18:37:29.425 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.305s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:37:29 compute-0 podman[192560]: time="2026-01-21T18:37:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 18:37:29 compute-0 podman[192560]: @ - - [21/Jan/2026:18:37:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15354 "" "Go-http-client/1.1"
Jan 21 18:37:29 compute-0 podman[192560]: @ - - [21/Jan/2026:18:37:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2179 "" "Go-http-client/1.1"
Jan 21 18:37:31 compute-0 nova_compute[183278]: 2026-01-21 18:37:31.151 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:37:31 compute-0 nova_compute[183278]: 2026-01-21 18:37:31.328 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:37:31 compute-0 openstack_network_exporter[195402]: ERROR   18:37:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 21 18:37:31 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:37:31 compute-0 openstack_network_exporter[195402]: ERROR   18:37:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 21 18:37:31 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:37:31 compute-0 nova_compute[183278]: 2026-01-21 18:37:31.426 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:37:31 compute-0 nova_compute[183278]: 2026-01-21 18:37:31.811 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:37:31 compute-0 nova_compute[183278]: 2026-01-21 18:37:31.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:37:31 compute-0 nova_compute[183278]: 2026-01-21 18:37:31.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:37:32 compute-0 nova_compute[183278]: 2026-01-21 18:37:32.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:37:35 compute-0 nova_compute[183278]: 2026-01-21 18:37:35.812 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:37:36 compute-0 nova_compute[183278]: 2026-01-21 18:37:36.152 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:37:36 compute-0 nova_compute[183278]: 2026-01-21 18:37:36.331 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:37:36 compute-0 nova_compute[183278]: 2026-01-21 18:37:36.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:37:38 compute-0 nova_compute[183278]: 2026-01-21 18:37:38.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:37:38 compute-0 nova_compute[183278]: 2026-01-21 18:37:38.816 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 18:37:41 compute-0 nova_compute[183278]: 2026-01-21 18:37:41.158 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:37:41 compute-0 nova_compute[183278]: 2026-01-21 18:37:41.334 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:37:42 compute-0 nova_compute[183278]: 2026-01-21 18:37:42.040 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:37:42 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:37:42.047 104698 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=24, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e2:aa:66', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f6:39:87:73:c8'}, ipsec=False) old=SB_Global(nb_cfg=23) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 18:37:42 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:37:42.048 104698 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 21 18:37:46 compute-0 nova_compute[183278]: 2026-01-21 18:37:46.157 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:37:46 compute-0 nova_compute[183278]: 2026-01-21 18:37:46.336 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:37:47 compute-0 podman[211935]: 2026-01-21 18:37:47.030670237 +0000 UTC m=+0.085981519 container health_status 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., io.openshift.expose-services=, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, version=9.6, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, distribution-scope=public, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter)
Jan 21 18:37:48 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:37:48.049 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a4db021d-a451-4e5f-8011-49af760bda68, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '24'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:37:51 compute-0 nova_compute[183278]: 2026-01-21 18:37:51.158 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:37:51 compute-0 nova_compute[183278]: 2026-01-21 18:37:51.338 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:37:53 compute-0 podman[211958]: 2026-01-21 18:37:53.993145819 +0000 UTC m=+0.048214986 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 21 18:37:54 compute-0 podman[211957]: 2026-01-21 18:37:54.01920151 +0000 UTC m=+0.078802156 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3)
Jan 21 18:37:56 compute-0 nova_compute[183278]: 2026-01-21 18:37:56.159 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:37:56 compute-0 nova_compute[183278]: 2026-01-21 18:37:56.339 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:37:58 compute-0 podman[212001]: 2026-01-21 18:37:58.993354786 +0000 UTC m=+0.047438288 container health_status 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 21 18:37:59 compute-0 nova_compute[183278]: 2026-01-21 18:37:59.148 183284 DEBUG oslo_concurrency.lockutils [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Acquiring lock "20fd0a46-611d-4fcb-942c-11b291dfbaad" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:37:59 compute-0 nova_compute[183278]: 2026-01-21 18:37:59.149 183284 DEBUG oslo_concurrency.lockutils [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Lock "20fd0a46-611d-4fcb-942c-11b291dfbaad" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:37:59 compute-0 nova_compute[183278]: 2026-01-21 18:37:59.176 183284 DEBUG nova.compute.manager [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] [instance: 20fd0a46-611d-4fcb-942c-11b291dfbaad] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 21 18:37:59 compute-0 nova_compute[183278]: 2026-01-21 18:37:59.247 183284 DEBUG oslo_concurrency.lockutils [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:37:59 compute-0 nova_compute[183278]: 2026-01-21 18:37:59.248 183284 DEBUG oslo_concurrency.lockutils [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:38:00 compute-0 podman[192560]: time="2026-01-21T18:38:00Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 18:38:00 compute-0 podman[192560]: @ - - [21/Jan/2026:18:38:00 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15354 "" "Go-http-client/1.1"
Jan 21 18:38:00 compute-0 nova_compute[183278]: 2026-01-21 18:38:00.372 183284 DEBUG nova.virt.hardware [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 18:38:00 compute-0 nova_compute[183278]: 2026-01-21 18:38:00.372 183284 INFO nova.compute.claims [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] [instance: 20fd0a46-611d-4fcb-942c-11b291dfbaad] Claim successful on node compute-0.ctlplane.example.com
Jan 21 18:38:00 compute-0 podman[192560]: @ - - [21/Jan/2026:18:38:00 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2180 "" "Go-http-client/1.1"
Jan 21 18:38:00 compute-0 nova_compute[183278]: 2026-01-21 18:38:00.475 183284 DEBUG nova.compute.provider_tree [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Inventory has not changed in ProviderTree for provider: 502e4243-611b-433d-a766-9b485d51652d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 18:38:00 compute-0 nova_compute[183278]: 2026-01-21 18:38:00.493 183284 DEBUG nova.scheduler.client.report [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Inventory has not changed for provider 502e4243-611b-433d-a766-9b485d51652d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 18:38:00 compute-0 nova_compute[183278]: 2026-01-21 18:38:00.514 183284 DEBUG oslo_concurrency.lockutils [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.267s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:38:00 compute-0 nova_compute[183278]: 2026-01-21 18:38:00.515 183284 DEBUG nova.compute.manager [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] [instance: 20fd0a46-611d-4fcb-942c-11b291dfbaad] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 21 18:38:00 compute-0 nova_compute[183278]: 2026-01-21 18:38:00.574 183284 DEBUG nova.compute.manager [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] [instance: 20fd0a46-611d-4fcb-942c-11b291dfbaad] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 21 18:38:00 compute-0 nova_compute[183278]: 2026-01-21 18:38:00.575 183284 DEBUG nova.network.neutron [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] [instance: 20fd0a46-611d-4fcb-942c-11b291dfbaad] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 21 18:38:00 compute-0 nova_compute[183278]: 2026-01-21 18:38:00.604 183284 INFO nova.virt.libvirt.driver [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] [instance: 20fd0a46-611d-4fcb-942c-11b291dfbaad] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 21 18:38:00 compute-0 nova_compute[183278]: 2026-01-21 18:38:00.620 183284 DEBUG nova.compute.manager [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] [instance: 20fd0a46-611d-4fcb-942c-11b291dfbaad] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 21 18:38:00 compute-0 nova_compute[183278]: 2026-01-21 18:38:00.720 183284 DEBUG nova.compute.manager [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] [instance: 20fd0a46-611d-4fcb-942c-11b291dfbaad] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 21 18:38:00 compute-0 nova_compute[183278]: 2026-01-21 18:38:00.721 183284 DEBUG nova.virt.libvirt.driver [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] [instance: 20fd0a46-611d-4fcb-942c-11b291dfbaad] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 18:38:00 compute-0 nova_compute[183278]: 2026-01-21 18:38:00.722 183284 INFO nova.virt.libvirt.driver [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] [instance: 20fd0a46-611d-4fcb-942c-11b291dfbaad] Creating image(s)
Jan 21 18:38:00 compute-0 nova_compute[183278]: 2026-01-21 18:38:00.723 183284 DEBUG oslo_concurrency.lockutils [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Acquiring lock "/var/lib/nova/instances/20fd0a46-611d-4fcb-942c-11b291dfbaad/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:38:00 compute-0 nova_compute[183278]: 2026-01-21 18:38:00.723 183284 DEBUG oslo_concurrency.lockutils [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Lock "/var/lib/nova/instances/20fd0a46-611d-4fcb-942c-11b291dfbaad/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:38:00 compute-0 nova_compute[183278]: 2026-01-21 18:38:00.724 183284 DEBUG oslo_concurrency.lockutils [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Lock "/var/lib/nova/instances/20fd0a46-611d-4fcb-942c-11b291dfbaad/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:38:00 compute-0 nova_compute[183278]: 2026-01-21 18:38:00.736 183284 DEBUG oslo_concurrency.processutils [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:38:00 compute-0 nova_compute[183278]: 2026-01-21 18:38:00.790 183284 DEBUG oslo_concurrency.processutils [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:38:00 compute-0 nova_compute[183278]: 2026-01-21 18:38:00.791 183284 DEBUG oslo_concurrency.lockutils [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Acquiring lock "8bf45782a806e8f2f684ae874be9ab99d891a685" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:38:00 compute-0 nova_compute[183278]: 2026-01-21 18:38:00.792 183284 DEBUG oslo_concurrency.lockutils [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Lock "8bf45782a806e8f2f684ae874be9ab99d891a685" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:38:00 compute-0 nova_compute[183278]: 2026-01-21 18:38:00.802 183284 DEBUG oslo_concurrency.processutils [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:38:00 compute-0 nova_compute[183278]: 2026-01-21 18:38:00.834 183284 DEBUG nova.policy [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '61bd089d194f4cf380e1a5f0c92c9c62', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '55c360f5fa1e445396df9c0ba67fb46d', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 21 18:38:00 compute-0 nova_compute[183278]: 2026-01-21 18:38:00.856 183284 DEBUG oslo_concurrency.processutils [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:38:00 compute-0 nova_compute[183278]: 2026-01-21 18:38:00.857 183284 DEBUG oslo_concurrency.processutils [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685,backing_fmt=raw /var/lib/nova/instances/20fd0a46-611d-4fcb-942c-11b291dfbaad/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:38:00 compute-0 sshd-session[212025]: Invalid user ethereum from 64.227.98.100 port 57778
Jan 21 18:38:00 compute-0 nova_compute[183278]: 2026-01-21 18:38:00.887 183284 DEBUG oslo_concurrency.processutils [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685,backing_fmt=raw /var/lib/nova/instances/20fd0a46-611d-4fcb-942c-11b291dfbaad/disk 1073741824" returned: 0 in 0.031s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:38:00 compute-0 nova_compute[183278]: 2026-01-21 18:38:00.888 183284 DEBUG oslo_concurrency.lockutils [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Lock "8bf45782a806e8f2f684ae874be9ab99d891a685" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.096s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:38:00 compute-0 nova_compute[183278]: 2026-01-21 18:38:00.888 183284 DEBUG oslo_concurrency.processutils [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:38:00 compute-0 nova_compute[183278]: 2026-01-21 18:38:00.940 183284 DEBUG oslo_concurrency.processutils [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:38:00 compute-0 nova_compute[183278]: 2026-01-21 18:38:00.941 183284 DEBUG nova.virt.disk.api [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Checking if we can resize image /var/lib/nova/instances/20fd0a46-611d-4fcb-942c-11b291dfbaad/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 18:38:00 compute-0 nova_compute[183278]: 2026-01-21 18:38:00.942 183284 DEBUG oslo_concurrency.processutils [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/20fd0a46-611d-4fcb-942c-11b291dfbaad/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:38:00 compute-0 sshd-session[212025]: Connection closed by invalid user ethereum 64.227.98.100 port 57778 [preauth]
Jan 21 18:38:01 compute-0 nova_compute[183278]: 2026-01-21 18:38:01.002 183284 DEBUG oslo_concurrency.processutils [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/20fd0a46-611d-4fcb-942c-11b291dfbaad/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:38:01 compute-0 nova_compute[183278]: 2026-01-21 18:38:01.003 183284 DEBUG nova.virt.disk.api [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Cannot resize image /var/lib/nova/instances/20fd0a46-611d-4fcb-942c-11b291dfbaad/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 18:38:01 compute-0 nova_compute[183278]: 2026-01-21 18:38:01.003 183284 DEBUG nova.objects.instance [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Lazy-loading 'migration_context' on Instance uuid 20fd0a46-611d-4fcb-942c-11b291dfbaad obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:38:01 compute-0 nova_compute[183278]: 2026-01-21 18:38:01.019 183284 DEBUG nova.virt.libvirt.driver [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] [instance: 20fd0a46-611d-4fcb-942c-11b291dfbaad] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 18:38:01 compute-0 nova_compute[183278]: 2026-01-21 18:38:01.019 183284 DEBUG nova.virt.libvirt.driver [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] [instance: 20fd0a46-611d-4fcb-942c-11b291dfbaad] Ensure instance console log exists: /var/lib/nova/instances/20fd0a46-611d-4fcb-942c-11b291dfbaad/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 18:38:01 compute-0 nova_compute[183278]: 2026-01-21 18:38:01.020 183284 DEBUG oslo_concurrency.lockutils [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:38:01 compute-0 nova_compute[183278]: 2026-01-21 18:38:01.020 183284 DEBUG oslo_concurrency.lockutils [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:38:01 compute-0 nova_compute[183278]: 2026-01-21 18:38:01.020 183284 DEBUG oslo_concurrency.lockutils [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:38:01 compute-0 nova_compute[183278]: 2026-01-21 18:38:01.211 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:38:01 compute-0 nova_compute[183278]: 2026-01-21 18:38:01.340 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:38:01 compute-0 openstack_network_exporter[195402]: ERROR   18:38:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 21 18:38:01 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:38:01 compute-0 openstack_network_exporter[195402]: ERROR   18:38:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 21 18:38:01 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:38:02 compute-0 nova_compute[183278]: 2026-01-21 18:38:02.033 183284 DEBUG nova.network.neutron [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] [instance: 20fd0a46-611d-4fcb-942c-11b291dfbaad] Successfully created port: 40abab10-895c-40a6-b87b-12fced1a5b22 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 21 18:38:03 compute-0 nova_compute[183278]: 2026-01-21 18:38:03.587 183284 DEBUG nova.network.neutron [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] [instance: 20fd0a46-611d-4fcb-942c-11b291dfbaad] Successfully updated port: 40abab10-895c-40a6-b87b-12fced1a5b22 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 21 18:38:03 compute-0 nova_compute[183278]: 2026-01-21 18:38:03.687 183284 DEBUG nova.compute.manager [req-1d039c22-f479-4e54-9357-f760980d1484 req-be9ccb0a-d201-433a-acbf-f5bddfb08b59 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 20fd0a46-611d-4fcb-942c-11b291dfbaad] Received event network-changed-40abab10-895c-40a6-b87b-12fced1a5b22 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:38:03 compute-0 nova_compute[183278]: 2026-01-21 18:38:03.687 183284 DEBUG nova.compute.manager [req-1d039c22-f479-4e54-9357-f760980d1484 req-be9ccb0a-d201-433a-acbf-f5bddfb08b59 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 20fd0a46-611d-4fcb-942c-11b291dfbaad] Refreshing instance network info cache due to event network-changed-40abab10-895c-40a6-b87b-12fced1a5b22. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 18:38:03 compute-0 nova_compute[183278]: 2026-01-21 18:38:03.688 183284 DEBUG oslo_concurrency.lockutils [req-1d039c22-f479-4e54-9357-f760980d1484 req-be9ccb0a-d201-433a-acbf-f5bddfb08b59 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "refresh_cache-20fd0a46-611d-4fcb-942c-11b291dfbaad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:38:03 compute-0 nova_compute[183278]: 2026-01-21 18:38:03.688 183284 DEBUG oslo_concurrency.lockutils [req-1d039c22-f479-4e54-9357-f760980d1484 req-be9ccb0a-d201-433a-acbf-f5bddfb08b59 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquired lock "refresh_cache-20fd0a46-611d-4fcb-942c-11b291dfbaad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 18:38:03 compute-0 nova_compute[183278]: 2026-01-21 18:38:03.688 183284 DEBUG nova.network.neutron [req-1d039c22-f479-4e54-9357-f760980d1484 req-be9ccb0a-d201-433a-acbf-f5bddfb08b59 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 20fd0a46-611d-4fcb-942c-11b291dfbaad] Refreshing network info cache for port 40abab10-895c-40a6-b87b-12fced1a5b22 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 18:38:03 compute-0 nova_compute[183278]: 2026-01-21 18:38:03.690 183284 DEBUG oslo_concurrency.lockutils [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Acquiring lock "refresh_cache-20fd0a46-611d-4fcb-942c-11b291dfbaad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:38:03 compute-0 nova_compute[183278]: 2026-01-21 18:38:03.918 183284 DEBUG nova.network.neutron [req-1d039c22-f479-4e54-9357-f760980d1484 req-be9ccb0a-d201-433a-acbf-f5bddfb08b59 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 20fd0a46-611d-4fcb-942c-11b291dfbaad] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 18:38:04 compute-0 ovn_controller[95419]: 2026-01-21T18:38:04Z|00176|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Jan 21 18:38:04 compute-0 nova_compute[183278]: 2026-01-21 18:38:04.210 183284 DEBUG nova.network.neutron [req-1d039c22-f479-4e54-9357-f760980d1484 req-be9ccb0a-d201-433a-acbf-f5bddfb08b59 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 20fd0a46-611d-4fcb-942c-11b291dfbaad] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:38:04 compute-0 nova_compute[183278]: 2026-01-21 18:38:04.225 183284 DEBUG oslo_concurrency.lockutils [req-1d039c22-f479-4e54-9357-f760980d1484 req-be9ccb0a-d201-433a-acbf-f5bddfb08b59 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Releasing lock "refresh_cache-20fd0a46-611d-4fcb-942c-11b291dfbaad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 18:38:04 compute-0 nova_compute[183278]: 2026-01-21 18:38:04.226 183284 DEBUG oslo_concurrency.lockutils [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Acquired lock "refresh_cache-20fd0a46-611d-4fcb-942c-11b291dfbaad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 18:38:04 compute-0 nova_compute[183278]: 2026-01-21 18:38:04.226 183284 DEBUG nova.network.neutron [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] [instance: 20fd0a46-611d-4fcb-942c-11b291dfbaad] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 18:38:04 compute-0 nova_compute[183278]: 2026-01-21 18:38:04.812 183284 DEBUG nova.network.neutron [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] [instance: 20fd0a46-611d-4fcb-942c-11b291dfbaad] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 18:38:06 compute-0 nova_compute[183278]: 2026-01-21 18:38:06.009 183284 DEBUG nova.network.neutron [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] [instance: 20fd0a46-611d-4fcb-942c-11b291dfbaad] Updating instance_info_cache with network_info: [{"id": "40abab10-895c-40a6-b87b-12fced1a5b22", "address": "fa:16:3e:66:ba:be", "network": {"id": "bdeafc54-899b-4c66-8810-1da93d23a91b", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-864084064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "55c360f5fa1e445396df9c0ba67fb46d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap40abab10-89", "ovs_interfaceid": "40abab10-895c-40a6-b87b-12fced1a5b22", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:38:06 compute-0 nova_compute[183278]: 2026-01-21 18:38:06.034 183284 DEBUG oslo_concurrency.lockutils [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Releasing lock "refresh_cache-20fd0a46-611d-4fcb-942c-11b291dfbaad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 18:38:06 compute-0 nova_compute[183278]: 2026-01-21 18:38:06.034 183284 DEBUG nova.compute.manager [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] [instance: 20fd0a46-611d-4fcb-942c-11b291dfbaad] Instance network_info: |[{"id": "40abab10-895c-40a6-b87b-12fced1a5b22", "address": "fa:16:3e:66:ba:be", "network": {"id": "bdeafc54-899b-4c66-8810-1da93d23a91b", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-864084064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "55c360f5fa1e445396df9c0ba67fb46d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap40abab10-89", "ovs_interfaceid": "40abab10-895c-40a6-b87b-12fced1a5b22", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 21 18:38:06 compute-0 nova_compute[183278]: 2026-01-21 18:38:06.036 183284 DEBUG nova.virt.libvirt.driver [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] [instance: 20fd0a46-611d-4fcb-942c-11b291dfbaad] Start _get_guest_xml network_info=[{"id": "40abab10-895c-40a6-b87b-12fced1a5b22", "address": "fa:16:3e:66:ba:be", "network": {"id": "bdeafc54-899b-4c66-8810-1da93d23a91b", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-864084064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "55c360f5fa1e445396df9c0ba67fb46d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap40abab10-89", "ovs_interfaceid": "40abab10-895c-40a6-b87b-12fced1a5b22", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T18:09:31Z,direct_url=<?>,disk_format='qcow2',id=672306ae-5521-4fc1-a825-a16d6d125c61,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='05bd8d3790e24414a90ecf55ee1939e1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T18:09:33Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_type': 'disk', 'encryption_options': None, 'boot_index': 0, 'encryption_secret_uuid': None, 'encryption_format': None, 'image_id': '672306ae-5521-4fc1-a825-a16d6d125c61'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 18:38:06 compute-0 nova_compute[183278]: 2026-01-21 18:38:06.041 183284 WARNING nova.virt.libvirt.driver [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 18:38:06 compute-0 nova_compute[183278]: 2026-01-21 18:38:06.047 183284 DEBUG nova.virt.libvirt.host [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 18:38:06 compute-0 nova_compute[183278]: 2026-01-21 18:38:06.047 183284 DEBUG nova.virt.libvirt.host [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 18:38:06 compute-0 nova_compute[183278]: 2026-01-21 18:38:06.051 183284 DEBUG nova.virt.libvirt.host [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 18:38:06 compute-0 nova_compute[183278]: 2026-01-21 18:38:06.052 183284 DEBUG nova.virt.libvirt.host [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 18:38:06 compute-0 nova_compute[183278]: 2026-01-21 18:38:06.053 183284 DEBUG nova.virt.libvirt.driver [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 18:38:06 compute-0 nova_compute[183278]: 2026-01-21 18:38:06.053 183284 DEBUG nova.virt.hardware [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T18:09:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='45095fe9-3fd5-4f1f-87b2-a2a8292135a2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T18:09:31Z,direct_url=<?>,disk_format='qcow2',id=672306ae-5521-4fc1-a825-a16d6d125c61,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='05bd8d3790e24414a90ecf55ee1939e1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T18:09:33Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 18:38:06 compute-0 nova_compute[183278]: 2026-01-21 18:38:06.053 183284 DEBUG nova.virt.hardware [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 18:38:06 compute-0 nova_compute[183278]: 2026-01-21 18:38:06.054 183284 DEBUG nova.virt.hardware [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 18:38:06 compute-0 nova_compute[183278]: 2026-01-21 18:38:06.054 183284 DEBUG nova.virt.hardware [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 18:38:06 compute-0 nova_compute[183278]: 2026-01-21 18:38:06.054 183284 DEBUG nova.virt.hardware [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 18:38:06 compute-0 nova_compute[183278]: 2026-01-21 18:38:06.054 183284 DEBUG nova.virt.hardware [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 18:38:06 compute-0 nova_compute[183278]: 2026-01-21 18:38:06.055 183284 DEBUG nova.virt.hardware [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 18:38:06 compute-0 nova_compute[183278]: 2026-01-21 18:38:06.055 183284 DEBUG nova.virt.hardware [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 18:38:06 compute-0 nova_compute[183278]: 2026-01-21 18:38:06.055 183284 DEBUG nova.virt.hardware [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 18:38:06 compute-0 nova_compute[183278]: 2026-01-21 18:38:06.055 183284 DEBUG nova.virt.hardware [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 18:38:06 compute-0 nova_compute[183278]: 2026-01-21 18:38:06.056 183284 DEBUG nova.virt.hardware [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 18:38:06 compute-0 nova_compute[183278]: 2026-01-21 18:38:06.060 183284 DEBUG nova.virt.libvirt.vif [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T18:37:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-1706326110',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-1706326110',id=25,image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='55c360f5fa1e445396df9c0ba67fb46d',ramdisk_id='',reservation_id='r-wtrayq0d',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-145388266',owner_user_name='tempest-TestExecuteVmWorkloadBa
lanceStrategy-145388266-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T18:38:00Z,user_data=None,user_id='61bd089d194f4cf380e1a5f0c92c9c62',uuid=20fd0a46-611d-4fcb-942c-11b291dfbaad,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "40abab10-895c-40a6-b87b-12fced1a5b22", "address": "fa:16:3e:66:ba:be", "network": {"id": "bdeafc54-899b-4c66-8810-1da93d23a91b", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-864084064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "55c360f5fa1e445396df9c0ba67fb46d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap40abab10-89", "ovs_interfaceid": "40abab10-895c-40a6-b87b-12fced1a5b22", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 18:38:06 compute-0 nova_compute[183278]: 2026-01-21 18:38:06.060 183284 DEBUG nova.network.os_vif_util [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Converting VIF {"id": "40abab10-895c-40a6-b87b-12fced1a5b22", "address": "fa:16:3e:66:ba:be", "network": {"id": "bdeafc54-899b-4c66-8810-1da93d23a91b", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-864084064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "55c360f5fa1e445396df9c0ba67fb46d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap40abab10-89", "ovs_interfaceid": "40abab10-895c-40a6-b87b-12fced1a5b22", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 18:38:06 compute-0 nova_compute[183278]: 2026-01-21 18:38:06.061 183284 DEBUG nova.network.os_vif_util [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:66:ba:be,bridge_name='br-int',has_traffic_filtering=True,id=40abab10-895c-40a6-b87b-12fced1a5b22,network=Network(bdeafc54-899b-4c66-8810-1da93d23a91b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap40abab10-89') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 18:38:06 compute-0 nova_compute[183278]: 2026-01-21 18:38:06.061 183284 DEBUG nova.objects.instance [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Lazy-loading 'pci_devices' on Instance uuid 20fd0a46-611d-4fcb-942c-11b291dfbaad obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:38:06 compute-0 nova_compute[183278]: 2026-01-21 18:38:06.075 183284 DEBUG nova.virt.libvirt.driver [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] [instance: 20fd0a46-611d-4fcb-942c-11b291dfbaad] End _get_guest_xml xml=<domain type="kvm">
Jan 21 18:38:06 compute-0 nova_compute[183278]:   <uuid>20fd0a46-611d-4fcb-942c-11b291dfbaad</uuid>
Jan 21 18:38:06 compute-0 nova_compute[183278]:   <name>instance-00000019</name>
Jan 21 18:38:06 compute-0 nova_compute[183278]:   <memory>131072</memory>
Jan 21 18:38:06 compute-0 nova_compute[183278]:   <vcpu>1</vcpu>
Jan 21 18:38:06 compute-0 nova_compute[183278]:   <metadata>
Jan 21 18:38:06 compute-0 nova_compute[183278]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 18:38:06 compute-0 nova_compute[183278]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 18:38:06 compute-0 nova_compute[183278]:       <nova:name>tempest-TestExecuteVmWorkloadBalanceStrategy-server-1706326110</nova:name>
Jan 21 18:38:06 compute-0 nova_compute[183278]:       <nova:creationTime>2026-01-21 18:38:06</nova:creationTime>
Jan 21 18:38:06 compute-0 nova_compute[183278]:       <nova:flavor name="m1.nano">
Jan 21 18:38:06 compute-0 nova_compute[183278]:         <nova:memory>128</nova:memory>
Jan 21 18:38:06 compute-0 nova_compute[183278]:         <nova:disk>1</nova:disk>
Jan 21 18:38:06 compute-0 nova_compute[183278]:         <nova:swap>0</nova:swap>
Jan 21 18:38:06 compute-0 nova_compute[183278]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 18:38:06 compute-0 nova_compute[183278]:         <nova:vcpus>1</nova:vcpus>
Jan 21 18:38:06 compute-0 nova_compute[183278]:       </nova:flavor>
Jan 21 18:38:06 compute-0 nova_compute[183278]:       <nova:owner>
Jan 21 18:38:06 compute-0 nova_compute[183278]:         <nova:user uuid="61bd089d194f4cf380e1a5f0c92c9c62">tempest-TestExecuteVmWorkloadBalanceStrategy-145388266-project-member</nova:user>
Jan 21 18:38:06 compute-0 nova_compute[183278]:         <nova:project uuid="55c360f5fa1e445396df9c0ba67fb46d">tempest-TestExecuteVmWorkloadBalanceStrategy-145388266</nova:project>
Jan 21 18:38:06 compute-0 nova_compute[183278]:       </nova:owner>
Jan 21 18:38:06 compute-0 nova_compute[183278]:       <nova:root type="image" uuid="672306ae-5521-4fc1-a825-a16d6d125c61"/>
Jan 21 18:38:06 compute-0 nova_compute[183278]:       <nova:ports>
Jan 21 18:38:06 compute-0 nova_compute[183278]:         <nova:port uuid="40abab10-895c-40a6-b87b-12fced1a5b22">
Jan 21 18:38:06 compute-0 nova_compute[183278]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 21 18:38:06 compute-0 nova_compute[183278]:         </nova:port>
Jan 21 18:38:06 compute-0 nova_compute[183278]:       </nova:ports>
Jan 21 18:38:06 compute-0 nova_compute[183278]:     </nova:instance>
Jan 21 18:38:06 compute-0 nova_compute[183278]:   </metadata>
Jan 21 18:38:06 compute-0 nova_compute[183278]:   <sysinfo type="smbios">
Jan 21 18:38:06 compute-0 nova_compute[183278]:     <system>
Jan 21 18:38:06 compute-0 nova_compute[183278]:       <entry name="manufacturer">RDO</entry>
Jan 21 18:38:06 compute-0 nova_compute[183278]:       <entry name="product">OpenStack Compute</entry>
Jan 21 18:38:06 compute-0 nova_compute[183278]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 18:38:06 compute-0 nova_compute[183278]:       <entry name="serial">20fd0a46-611d-4fcb-942c-11b291dfbaad</entry>
Jan 21 18:38:06 compute-0 nova_compute[183278]:       <entry name="uuid">20fd0a46-611d-4fcb-942c-11b291dfbaad</entry>
Jan 21 18:38:06 compute-0 nova_compute[183278]:       <entry name="family">Virtual Machine</entry>
Jan 21 18:38:06 compute-0 nova_compute[183278]:     </system>
Jan 21 18:38:06 compute-0 nova_compute[183278]:   </sysinfo>
Jan 21 18:38:06 compute-0 nova_compute[183278]:   <os>
Jan 21 18:38:06 compute-0 nova_compute[183278]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 18:38:06 compute-0 nova_compute[183278]:     <boot dev="hd"/>
Jan 21 18:38:06 compute-0 nova_compute[183278]:     <smbios mode="sysinfo"/>
Jan 21 18:38:06 compute-0 nova_compute[183278]:   </os>
Jan 21 18:38:06 compute-0 nova_compute[183278]:   <features>
Jan 21 18:38:06 compute-0 nova_compute[183278]:     <acpi/>
Jan 21 18:38:06 compute-0 nova_compute[183278]:     <apic/>
Jan 21 18:38:06 compute-0 nova_compute[183278]:     <vmcoreinfo/>
Jan 21 18:38:06 compute-0 nova_compute[183278]:   </features>
Jan 21 18:38:06 compute-0 nova_compute[183278]:   <clock offset="utc">
Jan 21 18:38:06 compute-0 nova_compute[183278]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 18:38:06 compute-0 nova_compute[183278]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 18:38:06 compute-0 nova_compute[183278]:     <timer name="hpet" present="no"/>
Jan 21 18:38:06 compute-0 nova_compute[183278]:   </clock>
Jan 21 18:38:06 compute-0 nova_compute[183278]:   <cpu mode="custom" match="exact">
Jan 21 18:38:06 compute-0 nova_compute[183278]:     <model>Nehalem</model>
Jan 21 18:38:06 compute-0 nova_compute[183278]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 18:38:06 compute-0 nova_compute[183278]:   </cpu>
Jan 21 18:38:06 compute-0 nova_compute[183278]:   <devices>
Jan 21 18:38:06 compute-0 nova_compute[183278]:     <disk type="file" device="disk">
Jan 21 18:38:06 compute-0 nova_compute[183278]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 18:38:06 compute-0 nova_compute[183278]:       <source file="/var/lib/nova/instances/20fd0a46-611d-4fcb-942c-11b291dfbaad/disk"/>
Jan 21 18:38:06 compute-0 nova_compute[183278]:       <target dev="vda" bus="virtio"/>
Jan 21 18:38:06 compute-0 nova_compute[183278]:     </disk>
Jan 21 18:38:06 compute-0 nova_compute[183278]:     <disk type="file" device="cdrom">
Jan 21 18:38:06 compute-0 nova_compute[183278]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 18:38:06 compute-0 nova_compute[183278]:       <source file="/var/lib/nova/instances/20fd0a46-611d-4fcb-942c-11b291dfbaad/disk.config"/>
Jan 21 18:38:06 compute-0 nova_compute[183278]:       <target dev="sda" bus="sata"/>
Jan 21 18:38:06 compute-0 nova_compute[183278]:     </disk>
Jan 21 18:38:06 compute-0 nova_compute[183278]:     <interface type="ethernet">
Jan 21 18:38:06 compute-0 nova_compute[183278]:       <mac address="fa:16:3e:66:ba:be"/>
Jan 21 18:38:06 compute-0 nova_compute[183278]:       <model type="virtio"/>
Jan 21 18:38:06 compute-0 nova_compute[183278]:       <driver name="vhost" rx_queue_size="512"/>
Jan 21 18:38:06 compute-0 nova_compute[183278]:       <mtu size="1442"/>
Jan 21 18:38:06 compute-0 nova_compute[183278]:       <target dev="tap40abab10-89"/>
Jan 21 18:38:06 compute-0 nova_compute[183278]:     </interface>
Jan 21 18:38:06 compute-0 nova_compute[183278]:     <serial type="pty">
Jan 21 18:38:06 compute-0 nova_compute[183278]:       <log file="/var/lib/nova/instances/20fd0a46-611d-4fcb-942c-11b291dfbaad/console.log" append="off"/>
Jan 21 18:38:06 compute-0 nova_compute[183278]:     </serial>
Jan 21 18:38:06 compute-0 nova_compute[183278]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 18:38:06 compute-0 nova_compute[183278]:     <video>
Jan 21 18:38:06 compute-0 nova_compute[183278]:       <model type="virtio"/>
Jan 21 18:38:06 compute-0 nova_compute[183278]:     </video>
Jan 21 18:38:06 compute-0 nova_compute[183278]:     <input type="tablet" bus="usb"/>
Jan 21 18:38:06 compute-0 nova_compute[183278]:     <rng model="virtio">
Jan 21 18:38:06 compute-0 nova_compute[183278]:       <backend model="random">/dev/urandom</backend>
Jan 21 18:38:06 compute-0 nova_compute[183278]:     </rng>
Jan 21 18:38:06 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root"/>
Jan 21 18:38:06 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:38:06 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:38:06 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:38:06 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:38:06 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:38:06 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:38:06 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:38:06 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:38:06 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:38:06 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:38:06 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:38:06 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:38:06 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:38:06 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:38:06 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:38:06 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:38:06 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:38:06 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:38:06 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:38:06 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:38:06 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:38:06 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:38:06 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:38:06 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:38:06 compute-0 nova_compute[183278]:     <controller type="usb" index="0"/>
Jan 21 18:38:06 compute-0 nova_compute[183278]:     <memballoon model="virtio">
Jan 21 18:38:06 compute-0 nova_compute[183278]:       <stats period="10"/>
Jan 21 18:38:06 compute-0 nova_compute[183278]:     </memballoon>
Jan 21 18:38:06 compute-0 nova_compute[183278]:   </devices>
Jan 21 18:38:06 compute-0 nova_compute[183278]: </domain>
Jan 21 18:38:06 compute-0 nova_compute[183278]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 18:38:06 compute-0 nova_compute[183278]: 2026-01-21 18:38:06.077 183284 DEBUG nova.compute.manager [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] [instance: 20fd0a46-611d-4fcb-942c-11b291dfbaad] Preparing to wait for external event network-vif-plugged-40abab10-895c-40a6-b87b-12fced1a5b22 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 21 18:38:06 compute-0 nova_compute[183278]: 2026-01-21 18:38:06.077 183284 DEBUG oslo_concurrency.lockutils [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Acquiring lock "20fd0a46-611d-4fcb-942c-11b291dfbaad-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:38:06 compute-0 nova_compute[183278]: 2026-01-21 18:38:06.077 183284 DEBUG oslo_concurrency.lockutils [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Lock "20fd0a46-611d-4fcb-942c-11b291dfbaad-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:38:06 compute-0 nova_compute[183278]: 2026-01-21 18:38:06.078 183284 DEBUG oslo_concurrency.lockutils [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Lock "20fd0a46-611d-4fcb-942c-11b291dfbaad-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:38:06 compute-0 nova_compute[183278]: 2026-01-21 18:38:06.079 183284 DEBUG nova.virt.libvirt.vif [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T18:37:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-1706326110',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-1706326110',id=25,image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='55c360f5fa1e445396df9c0ba67fb46d',ramdisk_id='',reservation_id='r-wtrayq0d',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-145388266',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-145388266-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T18:38:00Z,user_data=None,user_id='61bd089d194f4cf380e1a5f0c92c9c62',uuid=20fd0a46-611d-4fcb-942c-11b291dfbaad,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "40abab10-895c-40a6-b87b-12fced1a5b22", "address": "fa:16:3e:66:ba:be", "network": {"id": "bdeafc54-899b-4c66-8810-1da93d23a91b", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-864084064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "55c360f5fa1e445396df9c0ba67fb46d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap40abab10-89", "ovs_interfaceid": "40abab10-895c-40a6-b87b-12fced1a5b22", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 21 18:38:06 compute-0 nova_compute[183278]: 2026-01-21 18:38:06.079 183284 DEBUG nova.network.os_vif_util [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Converting VIF {"id": "40abab10-895c-40a6-b87b-12fced1a5b22", "address": "fa:16:3e:66:ba:be", "network": {"id": "bdeafc54-899b-4c66-8810-1da93d23a91b", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-864084064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "55c360f5fa1e445396df9c0ba67fb46d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap40abab10-89", "ovs_interfaceid": "40abab10-895c-40a6-b87b-12fced1a5b22", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 18:38:06 compute-0 nova_compute[183278]: 2026-01-21 18:38:06.080 183284 DEBUG nova.network.os_vif_util [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:66:ba:be,bridge_name='br-int',has_traffic_filtering=True,id=40abab10-895c-40a6-b87b-12fced1a5b22,network=Network(bdeafc54-899b-4c66-8810-1da93d23a91b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap40abab10-89') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 18:38:06 compute-0 nova_compute[183278]: 2026-01-21 18:38:06.080 183284 DEBUG os_vif [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:66:ba:be,bridge_name='br-int',has_traffic_filtering=True,id=40abab10-895c-40a6-b87b-12fced1a5b22,network=Network(bdeafc54-899b-4c66-8810-1da93d23a91b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap40abab10-89') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 21 18:38:06 compute-0 nova_compute[183278]: 2026-01-21 18:38:06.081 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:38:06 compute-0 nova_compute[183278]: 2026-01-21 18:38:06.081 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:38:06 compute-0 nova_compute[183278]: 2026-01-21 18:38:06.082 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 18:38:06 compute-0 nova_compute[183278]: 2026-01-21 18:38:06.086 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:38:06 compute-0 nova_compute[183278]: 2026-01-21 18:38:06.086 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap40abab10-89, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:38:06 compute-0 nova_compute[183278]: 2026-01-21 18:38:06.087 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap40abab10-89, col_values=(('external_ids', {'iface-id': '40abab10-895c-40a6-b87b-12fced1a5b22', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:66:ba:be', 'vm-uuid': '20fd0a46-611d-4fcb-942c-11b291dfbaad'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:38:06 compute-0 NetworkManager[55506]: <info>  [1769020686.0901] manager: (tap40abab10-89): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/77)
Jan 21 18:38:06 compute-0 nova_compute[183278]: 2026-01-21 18:38:06.091 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 18:38:06 compute-0 nova_compute[183278]: 2026-01-21 18:38:06.094 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:38:06 compute-0 nova_compute[183278]: 2026-01-21 18:38:06.095 183284 INFO os_vif [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:66:ba:be,bridge_name='br-int',has_traffic_filtering=True,id=40abab10-895c-40a6-b87b-12fced1a5b22,network=Network(bdeafc54-899b-4c66-8810-1da93d23a91b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap40abab10-89')
Jan 21 18:38:06 compute-0 nova_compute[183278]: 2026-01-21 18:38:06.214 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:38:06 compute-0 nova_compute[183278]: 2026-01-21 18:38:06.235 183284 DEBUG nova.virt.libvirt.driver [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 18:38:06 compute-0 nova_compute[183278]: 2026-01-21 18:38:06.235 183284 DEBUG nova.virt.libvirt.driver [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 18:38:06 compute-0 nova_compute[183278]: 2026-01-21 18:38:06.235 183284 DEBUG nova.virt.libvirt.driver [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] No VIF found with MAC fa:16:3e:66:ba:be, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 21 18:38:06 compute-0 nova_compute[183278]: 2026-01-21 18:38:06.236 183284 INFO nova.virt.libvirt.driver [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] [instance: 20fd0a46-611d-4fcb-942c-11b291dfbaad] Using config drive
Jan 21 18:38:06 compute-0 nova_compute[183278]: 2026-01-21 18:38:06.928 183284 INFO nova.virt.libvirt.driver [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] [instance: 20fd0a46-611d-4fcb-942c-11b291dfbaad] Creating config drive at /var/lib/nova/instances/20fd0a46-611d-4fcb-942c-11b291dfbaad/disk.config
Jan 21 18:38:06 compute-0 nova_compute[183278]: 2026-01-21 18:38:06.933 183284 DEBUG oslo_concurrency.processutils [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/20fd0a46-611d-4fcb-942c-11b291dfbaad/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpost4i4uu execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:38:07 compute-0 nova_compute[183278]: 2026-01-21 18:38:07.055 183284 DEBUG oslo_concurrency.processutils [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/20fd0a46-611d-4fcb-942c-11b291dfbaad/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpost4i4uu" returned: 0 in 0.122s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:38:07 compute-0 kernel: tap40abab10-89: entered promiscuous mode
Jan 21 18:38:07 compute-0 NetworkManager[55506]: <info>  [1769020687.1149] manager: (tap40abab10-89): new Tun device (/org/freedesktop/NetworkManager/Devices/78)
Jan 21 18:38:07 compute-0 ovn_controller[95419]: 2026-01-21T18:38:07Z|00177|binding|INFO|Claiming lport 40abab10-895c-40a6-b87b-12fced1a5b22 for this chassis.
Jan 21 18:38:07 compute-0 ovn_controller[95419]: 2026-01-21T18:38:07Z|00178|binding|INFO|40abab10-895c-40a6-b87b-12fced1a5b22: Claiming fa:16:3e:66:ba:be 10.100.0.14
Jan 21 18:38:07 compute-0 nova_compute[183278]: 2026-01-21 18:38:07.115 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:38:07 compute-0 nova_compute[183278]: 2026-01-21 18:38:07.119 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:38:07 compute-0 nova_compute[183278]: 2026-01-21 18:38:07.122 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:38:07 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:38:07.131 104698 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:66:ba:be 10.100.0.14'], port_security=['fa:16:3e:66:ba:be 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '20fd0a46-611d-4fcb-942c-11b291dfbaad', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bdeafc54-899b-4c66-8810-1da93d23a91b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '55c360f5fa1e445396df9c0ba67fb46d', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2123a8b9-e62b-4182-97a1-e14b1b826b02', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dcc99376-1fa1-4201-9a47-c665a9504862, chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>], logical_port=40abab10-895c-40a6-b87b-12fced1a5b22) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 18:38:07 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:38:07.132 104698 INFO neutron.agent.ovn.metadata.agent [-] Port 40abab10-895c-40a6-b87b-12fced1a5b22 in datapath bdeafc54-899b-4c66-8810-1da93d23a91b bound to our chassis
Jan 21 18:38:07 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:38:07.133 104698 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network bdeafc54-899b-4c66-8810-1da93d23a91b
Jan 21 18:38:07 compute-0 systemd-udevd[212062]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 18:38:07 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:38:07.143 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[cf051d7f-0c35-48dd-a9ff-d14543f4be02]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:38:07 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:38:07.144 104698 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapbdeafc54-81 in ovnmeta-bdeafc54-899b-4c66-8810-1da93d23a91b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 21 18:38:07 compute-0 systemd-machined[154592]: New machine qemu-17-instance-00000019.
Jan 21 18:38:07 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:38:07.146 203892 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapbdeafc54-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 21 18:38:07 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:38:07.146 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[7098bdcb-d4bd-4eda-ab5a-01d7e088cdc8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:38:07 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:38:07.147 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[b67492a7-e8c2-4f03-be10-85c7b7b8041d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:38:07 compute-0 NetworkManager[55506]: <info>  [1769020687.1524] device (tap40abab10-89): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 18:38:07 compute-0 NetworkManager[55506]: <info>  [1769020687.1532] device (tap40abab10-89): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 18:38:07 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:38:07.158 105036 DEBUG oslo.privsep.daemon [-] privsep: reply[d5c4b42f-ecd1-4403-92d2-165d66528188]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:38:07 compute-0 systemd[1]: Started Virtual Machine qemu-17-instance-00000019.
Jan 21 18:38:07 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:38:07.172 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[0c16cc95-064c-4dcc-9704-41a9e606b8b5]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:38:07 compute-0 nova_compute[183278]: 2026-01-21 18:38:07.173 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:38:07 compute-0 ovn_controller[95419]: 2026-01-21T18:38:07Z|00179|binding|INFO|Setting lport 40abab10-895c-40a6-b87b-12fced1a5b22 ovn-installed in OVS
Jan 21 18:38:07 compute-0 ovn_controller[95419]: 2026-01-21T18:38:07Z|00180|binding|INFO|Setting lport 40abab10-895c-40a6-b87b-12fced1a5b22 up in Southbound
Jan 21 18:38:07 compute-0 nova_compute[183278]: 2026-01-21 18:38:07.178 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:38:07 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:38:07.204 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[916233ca-9636-40e4-8172-d17f0350161f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:38:07 compute-0 NetworkManager[55506]: <info>  [1769020687.2092] manager: (tapbdeafc54-80): new Veth device (/org/freedesktop/NetworkManager/Devices/79)
Jan 21 18:38:07 compute-0 systemd-udevd[212066]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 18:38:07 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:38:07.209 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[ffd90d5f-42d0-4a36-867e-7e80a36d8552]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:38:07 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:38:07.237 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[f8d2d988-5c14-4840-be4c-fbf2b8f0b85a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:38:07 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:38:07.241 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[4fdc7a36-7b12-4ec3-83d8-002fbb4ce9f7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:38:07 compute-0 NetworkManager[55506]: <info>  [1769020687.2614] device (tapbdeafc54-80): carrier: link connected
Jan 21 18:38:07 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:38:07.266 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[86896d6a-5aa9-4281-a887-27d3f7e069a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:38:07 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:38:07.282 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[4210aa3e-71cb-4058-a730-194810f110b7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbdeafc54-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c4:ad:73'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 55], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 525086, 'reachable_time': 34558, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212095, 'error': None, 'target': 'ovnmeta-bdeafc54-899b-4c66-8810-1da93d23a91b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:38:07 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:38:07.298 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[4a08600a-8d35-4f43-90f9-7dc8c178466a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec4:ad73'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 525086, 'tstamp': 525086}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212096, 'error': None, 'target': 'ovnmeta-bdeafc54-899b-4c66-8810-1da93d23a91b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:38:07 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:38:07.316 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[0095637a-b762-41b6-a3b7-a7be68ed0709]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbdeafc54-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c4:ad:73'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 55], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 525086, 'reachable_time': 34558, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 212097, 'error': None, 'target': 'ovnmeta-bdeafc54-899b-4c66-8810-1da93d23a91b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:38:07 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:38:07.349 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[3cc59b27-d8a0-4515-ab09-702feb26c0d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:38:07 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:38:07.413 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[b58373dc-fe16-45e4-89de-932b6e6768d1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:38:07 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:38:07.414 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbdeafc54-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:38:07 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:38:07.414 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 18:38:07 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:38:07.415 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbdeafc54-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:38:07 compute-0 kernel: tapbdeafc54-80: entered promiscuous mode
Jan 21 18:38:07 compute-0 NetworkManager[55506]: <info>  [1769020687.4173] manager: (tapbdeafc54-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/80)
Jan 21 18:38:07 compute-0 nova_compute[183278]: 2026-01-21 18:38:07.416 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:38:07 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:38:07.418 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapbdeafc54-80, col_values=(('external_ids', {'iface-id': '852717d2-9d24-4348-bdea-ddbd3580930b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:38:07 compute-0 ovn_controller[95419]: 2026-01-21T18:38:07Z|00181|binding|INFO|Releasing lport 852717d2-9d24-4348-bdea-ddbd3580930b from this chassis (sb_readonly=0)
Jan 21 18:38:07 compute-0 nova_compute[183278]: 2026-01-21 18:38:07.423 183284 DEBUG nova.compute.manager [req-eb0c9ee7-3737-4650-b6e2-bbeea453c108 req-d3aa6f3e-eee7-4ed2-9c58-7ec6942c7a55 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 20fd0a46-611d-4fcb-942c-11b291dfbaad] Received event network-vif-plugged-40abab10-895c-40a6-b87b-12fced1a5b22 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:38:07 compute-0 nova_compute[183278]: 2026-01-21 18:38:07.424 183284 DEBUG oslo_concurrency.lockutils [req-eb0c9ee7-3737-4650-b6e2-bbeea453c108 req-d3aa6f3e-eee7-4ed2-9c58-7ec6942c7a55 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "20fd0a46-611d-4fcb-942c-11b291dfbaad-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:38:07 compute-0 nova_compute[183278]: 2026-01-21 18:38:07.424 183284 DEBUG oslo_concurrency.lockutils [req-eb0c9ee7-3737-4650-b6e2-bbeea453c108 req-d3aa6f3e-eee7-4ed2-9c58-7ec6942c7a55 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "20fd0a46-611d-4fcb-942c-11b291dfbaad-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:38:07 compute-0 nova_compute[183278]: 2026-01-21 18:38:07.424 183284 DEBUG oslo_concurrency.lockutils [req-eb0c9ee7-3737-4650-b6e2-bbeea453c108 req-d3aa6f3e-eee7-4ed2-9c58-7ec6942c7a55 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "20fd0a46-611d-4fcb-942c-11b291dfbaad-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:38:07 compute-0 nova_compute[183278]: 2026-01-21 18:38:07.425 183284 DEBUG nova.compute.manager [req-eb0c9ee7-3737-4650-b6e2-bbeea453c108 req-d3aa6f3e-eee7-4ed2-9c58-7ec6942c7a55 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 20fd0a46-611d-4fcb-942c-11b291dfbaad] Processing event network-vif-plugged-40abab10-895c-40a6-b87b-12fced1a5b22 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 21 18:38:07 compute-0 nova_compute[183278]: 2026-01-21 18:38:07.430 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:38:07 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:38:07.430 104698 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/bdeafc54-899b-4c66-8810-1da93d23a91b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/bdeafc54-899b-4c66-8810-1da93d23a91b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 21 18:38:07 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:38:07.431 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[75423383-530f-4d2f-9645-6a13c08e50d2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:38:07 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:38:07.432 104698 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 18:38:07 compute-0 ovn_metadata_agent[104693]: global
Jan 21 18:38:07 compute-0 ovn_metadata_agent[104693]:     log         /dev/log local0 debug
Jan 21 18:38:07 compute-0 ovn_metadata_agent[104693]:     log-tag     haproxy-metadata-proxy-bdeafc54-899b-4c66-8810-1da93d23a91b
Jan 21 18:38:07 compute-0 ovn_metadata_agent[104693]:     user        root
Jan 21 18:38:07 compute-0 ovn_metadata_agent[104693]:     group       root
Jan 21 18:38:07 compute-0 ovn_metadata_agent[104693]:     maxconn     1024
Jan 21 18:38:07 compute-0 ovn_metadata_agent[104693]:     pidfile     /var/lib/neutron/external/pids/bdeafc54-899b-4c66-8810-1da93d23a91b.pid.haproxy
Jan 21 18:38:07 compute-0 ovn_metadata_agent[104693]:     daemon
Jan 21 18:38:07 compute-0 ovn_metadata_agent[104693]: 
Jan 21 18:38:07 compute-0 ovn_metadata_agent[104693]: defaults
Jan 21 18:38:07 compute-0 ovn_metadata_agent[104693]:     log global
Jan 21 18:38:07 compute-0 ovn_metadata_agent[104693]:     mode http
Jan 21 18:38:07 compute-0 ovn_metadata_agent[104693]:     option httplog
Jan 21 18:38:07 compute-0 ovn_metadata_agent[104693]:     option dontlognull
Jan 21 18:38:07 compute-0 ovn_metadata_agent[104693]:     option http-server-close
Jan 21 18:38:07 compute-0 ovn_metadata_agent[104693]:     option forwardfor
Jan 21 18:38:07 compute-0 ovn_metadata_agent[104693]:     retries                 3
Jan 21 18:38:07 compute-0 ovn_metadata_agent[104693]:     timeout http-request    30s
Jan 21 18:38:07 compute-0 ovn_metadata_agent[104693]:     timeout connect         30s
Jan 21 18:38:07 compute-0 ovn_metadata_agent[104693]:     timeout client          32s
Jan 21 18:38:07 compute-0 ovn_metadata_agent[104693]:     timeout server          32s
Jan 21 18:38:07 compute-0 ovn_metadata_agent[104693]:     timeout http-keep-alive 30s
Jan 21 18:38:07 compute-0 ovn_metadata_agent[104693]: 
Jan 21 18:38:07 compute-0 ovn_metadata_agent[104693]: 
Jan 21 18:38:07 compute-0 ovn_metadata_agent[104693]: listen listener
Jan 21 18:38:07 compute-0 ovn_metadata_agent[104693]:     bind 169.254.169.254:80
Jan 21 18:38:07 compute-0 ovn_metadata_agent[104693]:     server metadata /var/lib/neutron/metadata_proxy
Jan 21 18:38:07 compute-0 ovn_metadata_agent[104693]:     http-request add-header X-OVN-Network-ID bdeafc54-899b-4c66-8810-1da93d23a91b
Jan 21 18:38:07 compute-0 ovn_metadata_agent[104693]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 21 18:38:07 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:38:07.433 104698 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-bdeafc54-899b-4c66-8810-1da93d23a91b', 'env', 'PROCESS_TAG=haproxy-bdeafc54-899b-4c66-8810-1da93d23a91b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/bdeafc54-899b-4c66-8810-1da93d23a91b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 21 18:38:07 compute-0 podman[212129]: 2026-01-21 18:38:07.766175101 +0000 UTC m=+0.045793339 container create 9f6aa24469285926243a84968f9eeb1e2d5da0305e05d87e4a6735a97a6d4d28 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bdeafc54-899b-4c66-8810-1da93d23a91b, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 21 18:38:07 compute-0 systemd[1]: Started libpod-conmon-9f6aa24469285926243a84968f9eeb1e2d5da0305e05d87e4a6735a97a6d4d28.scope.
Jan 21 18:38:07 compute-0 systemd[1]: Started libcrun container.
Jan 21 18:38:07 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5d0b5a8b9751a0b473d94ed82df4fda028c4684a1f48258c8899ea2a681a1b8a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 18:38:07 compute-0 podman[212129]: 2026-01-21 18:38:07.743587035 +0000 UTC m=+0.023205263 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 18:38:07 compute-0 podman[212129]: 2026-01-21 18:38:07.84516585 +0000 UTC m=+0.124784108 container init 9f6aa24469285926243a84968f9eeb1e2d5da0305e05d87e4a6735a97a6d4d28 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bdeafc54-899b-4c66-8810-1da93d23a91b, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0)
Jan 21 18:38:07 compute-0 podman[212129]: 2026-01-21 18:38:07.853218995 +0000 UTC m=+0.132837223 container start 9f6aa24469285926243a84968f9eeb1e2d5da0305e05d87e4a6735a97a6d4d28 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bdeafc54-899b-4c66-8810-1da93d23a91b, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 21 18:38:07 compute-0 neutron-haproxy-ovnmeta-bdeafc54-899b-4c66-8810-1da93d23a91b[212144]: [NOTICE]   (212148) : New worker (212150) forked
Jan 21 18:38:07 compute-0 neutron-haproxy-ovnmeta-bdeafc54-899b-4c66-8810-1da93d23a91b[212144]: [NOTICE]   (212148) : Loading success.
Jan 21 18:38:08 compute-0 nova_compute[183278]: 2026-01-21 18:38:08.147 183284 DEBUG nova.compute.manager [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] [instance: 20fd0a46-611d-4fcb-942c-11b291dfbaad] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 18:38:08 compute-0 nova_compute[183278]: 2026-01-21 18:38:08.148 183284 DEBUG nova.virt.driver [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Emitting event <LifecycleEvent: 1769020688.1483638, 20fd0a46-611d-4fcb-942c-11b291dfbaad => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:38:08 compute-0 nova_compute[183278]: 2026-01-21 18:38:08.149 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 20fd0a46-611d-4fcb-942c-11b291dfbaad] VM Started (Lifecycle Event)
Jan 21 18:38:08 compute-0 nova_compute[183278]: 2026-01-21 18:38:08.152 183284 DEBUG nova.virt.libvirt.driver [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] [instance: 20fd0a46-611d-4fcb-942c-11b291dfbaad] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 18:38:08 compute-0 nova_compute[183278]: 2026-01-21 18:38:08.157 183284 INFO nova.virt.libvirt.driver [-] [instance: 20fd0a46-611d-4fcb-942c-11b291dfbaad] Instance spawned successfully.
Jan 21 18:38:08 compute-0 nova_compute[183278]: 2026-01-21 18:38:08.158 183284 DEBUG nova.virt.libvirt.driver [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] [instance: 20fd0a46-611d-4fcb-942c-11b291dfbaad] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 21 18:38:08 compute-0 nova_compute[183278]: 2026-01-21 18:38:08.528 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 20fd0a46-611d-4fcb-942c-11b291dfbaad] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:38:08 compute-0 nova_compute[183278]: 2026-01-21 18:38:08.536 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 20fd0a46-611d-4fcb-942c-11b291dfbaad] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 18:38:08 compute-0 nova_compute[183278]: 2026-01-21 18:38:08.540 183284 DEBUG nova.virt.libvirt.driver [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] [instance: 20fd0a46-611d-4fcb-942c-11b291dfbaad] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:38:08 compute-0 nova_compute[183278]: 2026-01-21 18:38:08.541 183284 DEBUG nova.virt.libvirt.driver [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] [instance: 20fd0a46-611d-4fcb-942c-11b291dfbaad] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:38:08 compute-0 nova_compute[183278]: 2026-01-21 18:38:08.542 183284 DEBUG nova.virt.libvirt.driver [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] [instance: 20fd0a46-611d-4fcb-942c-11b291dfbaad] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:38:08 compute-0 nova_compute[183278]: 2026-01-21 18:38:08.542 183284 DEBUG nova.virt.libvirt.driver [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] [instance: 20fd0a46-611d-4fcb-942c-11b291dfbaad] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:38:08 compute-0 nova_compute[183278]: 2026-01-21 18:38:08.542 183284 DEBUG nova.virt.libvirt.driver [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] [instance: 20fd0a46-611d-4fcb-942c-11b291dfbaad] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:38:08 compute-0 nova_compute[183278]: 2026-01-21 18:38:08.543 183284 DEBUG nova.virt.libvirt.driver [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] [instance: 20fd0a46-611d-4fcb-942c-11b291dfbaad] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:38:08 compute-0 nova_compute[183278]: 2026-01-21 18:38:08.556 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 20fd0a46-611d-4fcb-942c-11b291dfbaad] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 18:38:08 compute-0 nova_compute[183278]: 2026-01-21 18:38:08.556 183284 DEBUG nova.virt.driver [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Emitting event <LifecycleEvent: 1769020688.1489375, 20fd0a46-611d-4fcb-942c-11b291dfbaad => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:38:08 compute-0 nova_compute[183278]: 2026-01-21 18:38:08.557 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 20fd0a46-611d-4fcb-942c-11b291dfbaad] VM Paused (Lifecycle Event)
Jan 21 18:38:08 compute-0 nova_compute[183278]: 2026-01-21 18:38:08.579 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 20fd0a46-611d-4fcb-942c-11b291dfbaad] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:38:08 compute-0 nova_compute[183278]: 2026-01-21 18:38:08.583 183284 DEBUG nova.virt.driver [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Emitting event <LifecycleEvent: 1769020688.151101, 20fd0a46-611d-4fcb-942c-11b291dfbaad => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:38:08 compute-0 nova_compute[183278]: 2026-01-21 18:38:08.584 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 20fd0a46-611d-4fcb-942c-11b291dfbaad] VM Resumed (Lifecycle Event)
Jan 21 18:38:08 compute-0 nova_compute[183278]: 2026-01-21 18:38:08.605 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 20fd0a46-611d-4fcb-942c-11b291dfbaad] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:38:08 compute-0 nova_compute[183278]: 2026-01-21 18:38:08.609 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 20fd0a46-611d-4fcb-942c-11b291dfbaad] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 18:38:08 compute-0 nova_compute[183278]: 2026-01-21 18:38:08.614 183284 INFO nova.compute.manager [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] [instance: 20fd0a46-611d-4fcb-942c-11b291dfbaad] Took 7.89 seconds to spawn the instance on the hypervisor.
Jan 21 18:38:08 compute-0 nova_compute[183278]: 2026-01-21 18:38:08.615 183284 DEBUG nova.compute.manager [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] [instance: 20fd0a46-611d-4fcb-942c-11b291dfbaad] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:38:08 compute-0 nova_compute[183278]: 2026-01-21 18:38:08.632 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 20fd0a46-611d-4fcb-942c-11b291dfbaad] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 18:38:08 compute-0 nova_compute[183278]: 2026-01-21 18:38:08.679 183284 INFO nova.compute.manager [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] [instance: 20fd0a46-611d-4fcb-942c-11b291dfbaad] Took 9.46 seconds to build instance.
Jan 21 18:38:08 compute-0 nova_compute[183278]: 2026-01-21 18:38:08.695 183284 DEBUG oslo_concurrency.lockutils [None req-3c9de751-f36c-4269-8f91-5ab6639fd26c 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Lock "20fd0a46-611d-4fcb-942c-11b291dfbaad" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.546s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:38:09 compute-0 nova_compute[183278]: 2026-01-21 18:38:09.494 183284 DEBUG nova.compute.manager [req-b4fc6c38-bb75-4f10-b2e8-ddd93bf2e45e req-b6baaa0d-331c-4e13-a54f-7fb8c3b95761 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 20fd0a46-611d-4fcb-942c-11b291dfbaad] Received event network-vif-plugged-40abab10-895c-40a6-b87b-12fced1a5b22 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:38:09 compute-0 nova_compute[183278]: 2026-01-21 18:38:09.494 183284 DEBUG oslo_concurrency.lockutils [req-b4fc6c38-bb75-4f10-b2e8-ddd93bf2e45e req-b6baaa0d-331c-4e13-a54f-7fb8c3b95761 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "20fd0a46-611d-4fcb-942c-11b291dfbaad-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:38:09 compute-0 nova_compute[183278]: 2026-01-21 18:38:09.496 183284 DEBUG oslo_concurrency.lockutils [req-b4fc6c38-bb75-4f10-b2e8-ddd93bf2e45e req-b6baaa0d-331c-4e13-a54f-7fb8c3b95761 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "20fd0a46-611d-4fcb-942c-11b291dfbaad-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:38:09 compute-0 nova_compute[183278]: 2026-01-21 18:38:09.496 183284 DEBUG oslo_concurrency.lockutils [req-b4fc6c38-bb75-4f10-b2e8-ddd93bf2e45e req-b6baaa0d-331c-4e13-a54f-7fb8c3b95761 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "20fd0a46-611d-4fcb-942c-11b291dfbaad-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:38:09 compute-0 nova_compute[183278]: 2026-01-21 18:38:09.496 183284 DEBUG nova.compute.manager [req-b4fc6c38-bb75-4f10-b2e8-ddd93bf2e45e req-b6baaa0d-331c-4e13-a54f-7fb8c3b95761 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 20fd0a46-611d-4fcb-942c-11b291dfbaad] No waiting events found dispatching network-vif-plugged-40abab10-895c-40a6-b87b-12fced1a5b22 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:38:09 compute-0 nova_compute[183278]: 2026-01-21 18:38:09.497 183284 WARNING nova.compute.manager [req-b4fc6c38-bb75-4f10-b2e8-ddd93bf2e45e req-b6baaa0d-331c-4e13-a54f-7fb8c3b95761 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 20fd0a46-611d-4fcb-942c-11b291dfbaad] Received unexpected event network-vif-plugged-40abab10-895c-40a6-b87b-12fced1a5b22 for instance with vm_state active and task_state None.
Jan 21 18:38:11 compute-0 nova_compute[183278]: 2026-01-21 18:38:11.092 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:38:11 compute-0 nova_compute[183278]: 2026-01-21 18:38:11.213 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:38:16 compute-0 nova_compute[183278]: 2026-01-21 18:38:16.096 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:38:16 compute-0 nova_compute[183278]: 2026-01-21 18:38:16.215 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:38:18 compute-0 podman[212166]: 2026-01-21 18:38:18.009347441 +0000 UTC m=+0.063266150 container health_status 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, release=1755695350, name=ubi9-minimal, version=9.6, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 21 18:38:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:38:20.108 104698 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:38:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:38:20.109 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:38:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:38:20.110 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:38:21 compute-0 nova_compute[183278]: 2026-01-21 18:38:21.099 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:38:21 compute-0 nova_compute[183278]: 2026-01-21 18:38:21.217 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:38:21 compute-0 ovn_controller[95419]: 2026-01-21T18:38:21Z|00030|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:66:ba:be 10.100.0.14
Jan 21 18:38:21 compute-0 ovn_controller[95419]: 2026-01-21T18:38:21Z|00031|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:66:ba:be 10.100.0.14
Jan 21 18:38:24 compute-0 podman[212213]: 2026-01-21 18:38:24.997712948 +0000 UTC m=+0.051434994 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 21 18:38:25 compute-0 podman[212212]: 2026-01-21 18:38:25.017492786 +0000 UTC m=+0.076084530 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ovn_controller, 
org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 21 18:38:25 compute-0 nova_compute[183278]: 2026-01-21 18:38:25.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:38:25 compute-0 nova_compute[183278]: 2026-01-21 18:38:25.817 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 18:38:25 compute-0 nova_compute[183278]: 2026-01-21 18:38:25.818 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 18:38:26 compute-0 nova_compute[183278]: 2026-01-21 18:38:26.133 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:38:26 compute-0 nova_compute[183278]: 2026-01-21 18:38:26.219 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:38:26 compute-0 nova_compute[183278]: 2026-01-21 18:38:26.910 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "refresh_cache-20fd0a46-611d-4fcb-942c-11b291dfbaad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:38:26 compute-0 nova_compute[183278]: 2026-01-21 18:38:26.910 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquired lock "refresh_cache-20fd0a46-611d-4fcb-942c-11b291dfbaad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 18:38:26 compute-0 nova_compute[183278]: 2026-01-21 18:38:26.911 183284 DEBUG nova.network.neutron [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] [instance: 20fd0a46-611d-4fcb-942c-11b291dfbaad] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 21 18:38:26 compute-0 nova_compute[183278]: 2026-01-21 18:38:26.911 183284 DEBUG nova.objects.instance [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lazy-loading 'info_cache' on Instance uuid 20fd0a46-611d-4fcb-942c-11b291dfbaad obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:38:29 compute-0 nova_compute[183278]: 2026-01-21 18:38:29.620 183284 DEBUG nova.network.neutron [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] [instance: 20fd0a46-611d-4fcb-942c-11b291dfbaad] Updating instance_info_cache with network_info: [{"id": "40abab10-895c-40a6-b87b-12fced1a5b22", "address": "fa:16:3e:66:ba:be", "network": {"id": "bdeafc54-899b-4c66-8810-1da93d23a91b", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-864084064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "55c360f5fa1e445396df9c0ba67fb46d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap40abab10-89", "ovs_interfaceid": "40abab10-895c-40a6-b87b-12fced1a5b22", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:38:29 compute-0 nova_compute[183278]: 2026-01-21 18:38:29.654 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Releasing lock "refresh_cache-20fd0a46-611d-4fcb-942c-11b291dfbaad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 18:38:29 compute-0 nova_compute[183278]: 2026-01-21 18:38:29.654 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] [instance: 20fd0a46-611d-4fcb-942c-11b291dfbaad] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 21 18:38:29 compute-0 podman[192560]: time="2026-01-21T18:38:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 18:38:29 compute-0 podman[192560]: @ - - [21/Jan/2026:18:38:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16587 "" "Go-http-client/1.1"
Jan 21 18:38:29 compute-0 podman[192560]: @ - - [21/Jan/2026:18:38:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2641 "" "Go-http-client/1.1"
Jan 21 18:38:29 compute-0 podman[212258]: 2026-01-21 18:38:29.998245023 +0000 UTC m=+0.053905625 container health_status 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 21 18:38:30 compute-0 nova_compute[183278]: 2026-01-21 18:38:30.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:38:30 compute-0 nova_compute[183278]: 2026-01-21 18:38:30.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:38:30 compute-0 nova_compute[183278]: 2026-01-21 18:38:30.903 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:38:30 compute-0 nova_compute[183278]: 2026-01-21 18:38:30.904 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:38:30 compute-0 nova_compute[183278]: 2026-01-21 18:38:30.904 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:38:30 compute-0 nova_compute[183278]: 2026-01-21 18:38:30.905 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 18:38:30 compute-0 nova_compute[183278]: 2026-01-21 18:38:30.981 183284 DEBUG oslo_concurrency.processutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/20fd0a46-611d-4fcb-942c-11b291dfbaad/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:38:31 compute-0 nova_compute[183278]: 2026-01-21 18:38:31.039 183284 DEBUG oslo_concurrency.processutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/20fd0a46-611d-4fcb-942c-11b291dfbaad/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:38:31 compute-0 nova_compute[183278]: 2026-01-21 18:38:31.040 183284 DEBUG oslo_concurrency.processutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/20fd0a46-611d-4fcb-942c-11b291dfbaad/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:38:31 compute-0 nova_compute[183278]: 2026-01-21 18:38:31.101 183284 DEBUG oslo_concurrency.processutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/20fd0a46-611d-4fcb-942c-11b291dfbaad/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:38:31 compute-0 nova_compute[183278]: 2026-01-21 18:38:31.137 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:38:31 compute-0 nova_compute[183278]: 2026-01-21 18:38:31.222 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:38:31 compute-0 nova_compute[183278]: 2026-01-21 18:38:31.283 183284 WARNING nova.virt.libvirt.driver [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 18:38:31 compute-0 nova_compute[183278]: 2026-01-21 18:38:31.285 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5660MB free_disk=73.34963989257812GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 18:38:31 compute-0 nova_compute[183278]: 2026-01-21 18:38:31.285 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:38:31 compute-0 nova_compute[183278]: 2026-01-21 18:38:31.286 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:38:31 compute-0 nova_compute[183278]: 2026-01-21 18:38:31.379 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Instance 20fd0a46-611d-4fcb-942c-11b291dfbaad actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 21 18:38:31 compute-0 nova_compute[183278]: 2026-01-21 18:38:31.380 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 18:38:31 compute-0 nova_compute[183278]: 2026-01-21 18:38:31.380 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 18:38:31 compute-0 openstack_network_exporter[195402]: ERROR   18:38:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 21 18:38:31 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:38:31 compute-0 openstack_network_exporter[195402]: ERROR   18:38:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 21 18:38:31 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:38:31 compute-0 nova_compute[183278]: 2026-01-21 18:38:31.424 183284 DEBUG nova.scheduler.client.report [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Refreshing inventories for resource provider 502e4243-611b-433d-a766-9b485d51652d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 21 18:38:31 compute-0 nova_compute[183278]: 2026-01-21 18:38:31.466 183284 DEBUG nova.scheduler.client.report [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Updating ProviderTree inventory for provider 502e4243-611b-433d-a766-9b485d51652d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 21 18:38:31 compute-0 nova_compute[183278]: 2026-01-21 18:38:31.467 183284 DEBUG nova.compute.provider_tree [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Updating inventory in ProviderTree for provider 502e4243-611b-433d-a766-9b485d51652d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 21 18:38:31 compute-0 nova_compute[183278]: 2026-01-21 18:38:31.484 183284 DEBUG nova.scheduler.client.report [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Refreshing aggregate associations for resource provider 502e4243-611b-433d-a766-9b485d51652d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 21 18:38:31 compute-0 nova_compute[183278]: 2026-01-21 18:38:31.507 183284 DEBUG nova.scheduler.client.report [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Refreshing trait associations for resource provider 502e4243-611b-433d-a766-9b485d51652d, traits: COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_RESCUE_BFV,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_MMX,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NODE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_IDE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 21 18:38:31 compute-0 nova_compute[183278]: 2026-01-21 18:38:31.544 183284 DEBUG nova.compute.provider_tree [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Inventory has not changed in ProviderTree for provider: 502e4243-611b-433d-a766-9b485d51652d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 18:38:31 compute-0 nova_compute[183278]: 2026-01-21 18:38:31.560 183284 DEBUG nova.scheduler.client.report [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Inventory has not changed for provider 502e4243-611b-433d-a766-9b485d51652d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 18:38:31 compute-0 nova_compute[183278]: 2026-01-21 18:38:31.586 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 18:38:31 compute-0 nova_compute[183278]: 2026-01-21 18:38:31.586 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.301s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:38:32 compute-0 nova_compute[183278]: 2026-01-21 18:38:32.581 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:38:32 compute-0 nova_compute[183278]: 2026-01-21 18:38:32.582 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:38:33 compute-0 nova_compute[183278]: 2026-01-21 18:38:33.815 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:38:33 compute-0 nova_compute[183278]: 2026-01-21 18:38:33.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:38:36 compute-0 nova_compute[183278]: 2026-01-21 18:38:36.141 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:38:36 compute-0 nova_compute[183278]: 2026-01-21 18:38:36.254 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:38:37 compute-0 nova_compute[183278]: 2026-01-21 18:38:37.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:38:39 compute-0 nova_compute[183278]: 2026-01-21 18:38:39.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:38:39 compute-0 nova_compute[183278]: 2026-01-21 18:38:39.817 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 18:38:41 compute-0 nova_compute[183278]: 2026-01-21 18:38:41.144 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:38:41 compute-0 nova_compute[183278]: 2026-01-21 18:38:41.257 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:38:46 compute-0 nova_compute[183278]: 2026-01-21 18:38:46.146 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:38:46 compute-0 nova_compute[183278]: 2026-01-21 18:38:46.259 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:38:48 compute-0 ovn_controller[95419]: 2026-01-21T18:38:48Z|00182|memory_trim|INFO|Detected inactivity (last active 30018 ms ago): trimming memory
Jan 21 18:38:49 compute-0 podman[212291]: 2026-01-21 18:38:49.021572584 +0000 UTC m=+0.080033835 container health_status 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, release=1755695350, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, managed_by=edpm_ansible, architecture=x86_64, maintainer=Red Hat, Inc.)
Jan 21 18:38:51 compute-0 nova_compute[183278]: 2026-01-21 18:38:51.148 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:38:51 compute-0 nova_compute[183278]: 2026-01-21 18:38:51.261 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:38:55 compute-0 podman[212315]: 2026-01-21 18:38:55.998253709 +0000 UTC m=+0.048392441 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 18:38:56 compute-0 podman[212314]: 2026-01-21 18:38:56.01943363 +0000 UTC m=+0.074291866 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 18:38:56 compute-0 nova_compute[183278]: 2026-01-21 18:38:56.150 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:38:56 compute-0 nova_compute[183278]: 2026-01-21 18:38:56.262 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:38:59 compute-0 podman[192560]: time="2026-01-21T18:38:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 18:38:59 compute-0 podman[192560]: @ - - [21/Jan/2026:18:38:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16587 "" "Go-http-client/1.1"
Jan 21 18:38:59 compute-0 podman[192560]: @ - - [21/Jan/2026:18:38:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2640 "" "Go-http-client/1.1"
Jan 21 18:39:00 compute-0 podman[212358]: 2026-01-21 18:39:00.994315353 +0000 UTC m=+0.047883639 container health_status 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 18:39:01 compute-0 nova_compute[183278]: 2026-01-21 18:39:01.154 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:39:01 compute-0 nova_compute[183278]: 2026-01-21 18:39:01.265 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:39:01 compute-0 openstack_network_exporter[195402]: ERROR   18:39:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 21 18:39:01 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:39:01 compute-0 openstack_network_exporter[195402]: ERROR   18:39:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 21 18:39:01 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:39:06 compute-0 nova_compute[183278]: 2026-01-21 18:39:06.157 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:39:06 compute-0 nova_compute[183278]: 2026-01-21 18:39:06.269 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:39:11 compute-0 nova_compute[183278]: 2026-01-21 18:39:11.160 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:39:11 compute-0 nova_compute[183278]: 2026-01-21 18:39:11.270 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:39:16 compute-0 nova_compute[183278]: 2026-01-21 18:39:16.195 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:39:16 compute-0 nova_compute[183278]: 2026-01-21 18:39:16.271 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:39:19 compute-0 podman[212387]: 2026-01-21 18:39:19.997392466 +0000 UTC m=+0.056373785 container health_status 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., vcs-type=git, version=9.6, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, release=1755695350, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, name=ubi9-minimal, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container)
Jan 21 18:39:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:39:20.109 104698 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:39:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:39:20.111 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:39:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:39:20.111 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:39:21 compute-0 nova_compute[183278]: 2026-01-21 18:39:21.198 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:39:21 compute-0 nova_compute[183278]: 2026-01-21 18:39:21.272 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:39:25 compute-0 nova_compute[183278]: 2026-01-21 18:39:25.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:39:25 compute-0 nova_compute[183278]: 2026-01-21 18:39:25.817 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 18:39:25 compute-0 nova_compute[183278]: 2026-01-21 18:39:25.818 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 18:39:26 compute-0 nova_compute[183278]: 2026-01-21 18:39:26.038 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "refresh_cache-20fd0a46-611d-4fcb-942c-11b291dfbaad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:39:26 compute-0 nova_compute[183278]: 2026-01-21 18:39:26.038 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquired lock "refresh_cache-20fd0a46-611d-4fcb-942c-11b291dfbaad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 18:39:26 compute-0 nova_compute[183278]: 2026-01-21 18:39:26.038 183284 DEBUG nova.network.neutron [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] [instance: 20fd0a46-611d-4fcb-942c-11b291dfbaad] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 21 18:39:26 compute-0 nova_compute[183278]: 2026-01-21 18:39:26.038 183284 DEBUG nova.objects.instance [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lazy-loading 'info_cache' on Instance uuid 20fd0a46-611d-4fcb-942c-11b291dfbaad obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:39:26 compute-0 nova_compute[183278]: 2026-01-21 18:39:26.201 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:39:26 compute-0 nova_compute[183278]: 2026-01-21 18:39:26.274 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:39:26 compute-0 nova_compute[183278]: 2026-01-21 18:39:26.953 183284 DEBUG nova.network.neutron [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] [instance: 20fd0a46-611d-4fcb-942c-11b291dfbaad] Updating instance_info_cache with network_info: [{"id": "40abab10-895c-40a6-b87b-12fced1a5b22", "address": "fa:16:3e:66:ba:be", "network": {"id": "bdeafc54-899b-4c66-8810-1da93d23a91b", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-864084064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "55c360f5fa1e445396df9c0ba67fb46d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap40abab10-89", "ovs_interfaceid": "40abab10-895c-40a6-b87b-12fced1a5b22", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:39:26 compute-0 nova_compute[183278]: 2026-01-21 18:39:26.967 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Releasing lock "refresh_cache-20fd0a46-611d-4fcb-942c-11b291dfbaad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 18:39:26 compute-0 nova_compute[183278]: 2026-01-21 18:39:26.968 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] [instance: 20fd0a46-611d-4fcb-942c-11b291dfbaad] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 21 18:39:26 compute-0 podman[212409]: 2026-01-21 18:39:26.999793651 +0000 UTC m=+0.051996968 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Jan 21 18:39:27 compute-0 podman[212408]: 2026-01-21 18:39:27.02622702 +0000 UTC m=+0.085968830 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 21 18:39:29 compute-0 podman[192560]: time="2026-01-21T18:39:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 18:39:29 compute-0 podman[192560]: @ - - [21/Jan/2026:18:39:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16587 "" "Go-http-client/1.1"
Jan 21 18:39:29 compute-0 podman[192560]: @ - - [21/Jan/2026:18:39:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2642 "" "Go-http-client/1.1"
Jan 21 18:39:30 compute-0 nova_compute[183278]: 2026-01-21 18:39:30.030 183284 DEBUG nova.virt.libvirt.driver [None req-551fe554-a819-4922-8a48-0bb34cb4e15b 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1086a30e-9d4f-4278-8d00-f4ce489f3c08] Creating tmpfile /var/lib/nova/instances/tmpr5xws3fi to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Jan 21 18:39:30 compute-0 nova_compute[183278]: 2026-01-21 18:39:30.032 183284 DEBUG nova.compute.manager [None req-551fe554-a819-4922-8a48-0bb34cb4e15b 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpr5xws3fi',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Jan 21 18:39:30 compute-0 nova_compute[183278]: 2026-01-21 18:39:30.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:39:31 compute-0 nova_compute[183278]: 2026-01-21 18:39:31.204 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:39:31 compute-0 nova_compute[183278]: 2026-01-21 18:39:31.276 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:39:31 compute-0 nova_compute[183278]: 2026-01-21 18:39:31.313 183284 DEBUG nova.compute.manager [None req-551fe554-a819-4922-8a48-0bb34cb4e15b 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpr5xws3fi',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='1086a30e-9d4f-4278-8d00-f4ce489f3c08',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Jan 21 18:39:31 compute-0 nova_compute[183278]: 2026-01-21 18:39:31.338 183284 DEBUG oslo_concurrency.lockutils [None req-551fe554-a819-4922-8a48-0bb34cb4e15b 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "refresh_cache-1086a30e-9d4f-4278-8d00-f4ce489f3c08" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:39:31 compute-0 nova_compute[183278]: 2026-01-21 18:39:31.338 183284 DEBUG oslo_concurrency.lockutils [None req-551fe554-a819-4922-8a48-0bb34cb4e15b 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Acquired lock "refresh_cache-1086a30e-9d4f-4278-8d00-f4ce489f3c08" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 18:39:31 compute-0 nova_compute[183278]: 2026-01-21 18:39:31.338 183284 DEBUG nova.network.neutron [None req-551fe554-a819-4922-8a48-0bb34cb4e15b 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1086a30e-9d4f-4278-8d00-f4ce489f3c08] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 18:39:31 compute-0 openstack_network_exporter[195402]: ERROR   18:39:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 21 18:39:31 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:39:31 compute-0 openstack_network_exporter[195402]: ERROR   18:39:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 21 18:39:31 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:39:31 compute-0 nova_compute[183278]: 2026-01-21 18:39:31.812 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:39:31 compute-0 podman[212454]: 2026-01-21 18:39:31.996478363 +0000 UTC m=+0.052852819 container health_status 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 21 18:39:32 compute-0 nova_compute[183278]: 2026-01-21 18:39:32.803 183284 DEBUG nova.network.neutron [None req-551fe554-a819-4922-8a48-0bb34cb4e15b 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1086a30e-9d4f-4278-8d00-f4ce489f3c08] Updating instance_info_cache with network_info: [{"id": "06b8ba3f-960c-4966-9687-d40021122824", "address": "fa:16:3e:76:4b:aa", "network": {"id": "bdeafc54-899b-4c66-8810-1da93d23a91b", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-864084064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "55c360f5fa1e445396df9c0ba67fb46d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06b8ba3f-96", "ovs_interfaceid": "06b8ba3f-960c-4966-9687-d40021122824", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:39:32 compute-0 nova_compute[183278]: 2026-01-21 18:39:32.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:39:32 compute-0 nova_compute[183278]: 2026-01-21 18:39:32.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:39:32 compute-0 nova_compute[183278]: 2026-01-21 18:39:32.817 183284 DEBUG oslo_concurrency.lockutils [None req-551fe554-a819-4922-8a48-0bb34cb4e15b 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Releasing lock "refresh_cache-1086a30e-9d4f-4278-8d00-f4ce489f3c08" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 18:39:32 compute-0 nova_compute[183278]: 2026-01-21 18:39:32.819 183284 DEBUG nova.virt.libvirt.driver [None req-551fe554-a819-4922-8a48-0bb34cb4e15b 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1086a30e-9d4f-4278-8d00-f4ce489f3c08] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpr5xws3fi',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='1086a30e-9d4f-4278-8d00-f4ce489f3c08',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Jan 21 18:39:32 compute-0 nova_compute[183278]: 2026-01-21 18:39:32.819 183284 DEBUG nova.virt.libvirt.driver [None req-551fe554-a819-4922-8a48-0bb34cb4e15b 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1086a30e-9d4f-4278-8d00-f4ce489f3c08] Creating instance directory: /var/lib/nova/instances/1086a30e-9d4f-4278-8d00-f4ce489f3c08 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Jan 21 18:39:32 compute-0 nova_compute[183278]: 2026-01-21 18:39:32.820 183284 DEBUG nova.virt.libvirt.driver [None req-551fe554-a819-4922-8a48-0bb34cb4e15b 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1086a30e-9d4f-4278-8d00-f4ce489f3c08] Creating disk.info with the contents: {'/var/lib/nova/instances/1086a30e-9d4f-4278-8d00-f4ce489f3c08/disk': 'qcow2', '/var/lib/nova/instances/1086a30e-9d4f-4278-8d00-f4ce489f3c08/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854
Jan 21 18:39:32 compute-0 nova_compute[183278]: 2026-01-21 18:39:32.820 183284 DEBUG nova.virt.libvirt.driver [None req-551fe554-a819-4922-8a48-0bb34cb4e15b 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1086a30e-9d4f-4278-8d00-f4ce489f3c08] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864
Jan 21 18:39:32 compute-0 nova_compute[183278]: 2026-01-21 18:39:32.820 183284 DEBUG nova.objects.instance [None req-551fe554-a819-4922-8a48-0bb34cb4e15b 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 1086a30e-9d4f-4278-8d00-f4ce489f3c08 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:39:32 compute-0 nova_compute[183278]: 2026-01-21 18:39:32.846 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:39:32 compute-0 nova_compute[183278]: 2026-01-21 18:39:32.847 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:39:32 compute-0 nova_compute[183278]: 2026-01-21 18:39:32.847 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:39:32 compute-0 nova_compute[183278]: 2026-01-21 18:39:32.847 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 18:39:32 compute-0 nova_compute[183278]: 2026-01-21 18:39:32.848 183284 DEBUG oslo_concurrency.processutils [None req-551fe554-a819-4922-8a48-0bb34cb4e15b 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:39:32 compute-0 nova_compute[183278]: 2026-01-21 18:39:32.908 183284 DEBUG oslo_concurrency.processutils [None req-551fe554-a819-4922-8a48-0bb34cb4e15b 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:39:32 compute-0 nova_compute[183278]: 2026-01-21 18:39:32.909 183284 DEBUG oslo_concurrency.lockutils [None req-551fe554-a819-4922-8a48-0bb34cb4e15b 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "8bf45782a806e8f2f684ae874be9ab99d891a685" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:39:32 compute-0 nova_compute[183278]: 2026-01-21 18:39:32.910 183284 DEBUG oslo_concurrency.lockutils [None req-551fe554-a819-4922-8a48-0bb34cb4e15b 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lock "8bf45782a806e8f2f684ae874be9ab99d891a685" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:39:32 compute-0 nova_compute[183278]: 2026-01-21 18:39:32.921 183284 DEBUG oslo_concurrency.processutils [None req-551fe554-a819-4922-8a48-0bb34cb4e15b 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:39:32 compute-0 nova_compute[183278]: 2026-01-21 18:39:32.962 183284 DEBUG oslo_concurrency.processutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/20fd0a46-611d-4fcb-942c-11b291dfbaad/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:39:32 compute-0 nova_compute[183278]: 2026-01-21 18:39:32.980 183284 DEBUG oslo_concurrency.processutils [None req-551fe554-a819-4922-8a48-0bb34cb4e15b 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:39:32 compute-0 nova_compute[183278]: 2026-01-21 18:39:32.981 183284 DEBUG oslo_concurrency.processutils [None req-551fe554-a819-4922-8a48-0bb34cb4e15b 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685,backing_fmt=raw /var/lib/nova/instances/1086a30e-9d4f-4278-8d00-f4ce489f3c08/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:39:33 compute-0 nova_compute[183278]: 2026-01-21 18:39:33.020 183284 DEBUG oslo_concurrency.processutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/20fd0a46-611d-4fcb-942c-11b291dfbaad/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:39:33 compute-0 nova_compute[183278]: 2026-01-21 18:39:33.021 183284 DEBUG oslo_concurrency.processutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/20fd0a46-611d-4fcb-942c-11b291dfbaad/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:39:33 compute-0 nova_compute[183278]: 2026-01-21 18:39:33.039 183284 DEBUG oslo_concurrency.processutils [None req-551fe554-a819-4922-8a48-0bb34cb4e15b 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685,backing_fmt=raw /var/lib/nova/instances/1086a30e-9d4f-4278-8d00-f4ce489f3c08/disk 1073741824" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:39:33 compute-0 nova_compute[183278]: 2026-01-21 18:39:33.041 183284 DEBUG oslo_concurrency.lockutils [None req-551fe554-a819-4922-8a48-0bb34cb4e15b 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lock "8bf45782a806e8f2f684ae874be9ab99d891a685" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.131s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:39:33 compute-0 nova_compute[183278]: 2026-01-21 18:39:33.041 183284 DEBUG oslo_concurrency.processutils [None req-551fe554-a819-4922-8a48-0bb34cb4e15b 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:39:33 compute-0 nova_compute[183278]: 2026-01-21 18:39:33.084 183284 DEBUG oslo_concurrency.processutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/20fd0a46-611d-4fcb-942c-11b291dfbaad/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:39:33 compute-0 nova_compute[183278]: 2026-01-21 18:39:33.105 183284 DEBUG oslo_concurrency.processutils [None req-551fe554-a819-4922-8a48-0bb34cb4e15b 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:39:33 compute-0 nova_compute[183278]: 2026-01-21 18:39:33.106 183284 DEBUG nova.virt.disk.api [None req-551fe554-a819-4922-8a48-0bb34cb4e15b 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Checking if we can resize image /var/lib/nova/instances/1086a30e-9d4f-4278-8d00-f4ce489f3c08/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 18:39:33 compute-0 nova_compute[183278]: 2026-01-21 18:39:33.106 183284 DEBUG oslo_concurrency.processutils [None req-551fe554-a819-4922-8a48-0bb34cb4e15b 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1086a30e-9d4f-4278-8d00-f4ce489f3c08/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:39:33 compute-0 nova_compute[183278]: 2026-01-21 18:39:33.168 183284 DEBUG oslo_concurrency.processutils [None req-551fe554-a819-4922-8a48-0bb34cb4e15b 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1086a30e-9d4f-4278-8d00-f4ce489f3c08/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:39:33 compute-0 nova_compute[183278]: 2026-01-21 18:39:33.169 183284 DEBUG nova.virt.disk.api [None req-551fe554-a819-4922-8a48-0bb34cb4e15b 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Cannot resize image /var/lib/nova/instances/1086a30e-9d4f-4278-8d00-f4ce489f3c08/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 18:39:33 compute-0 nova_compute[183278]: 2026-01-21 18:39:33.169 183284 DEBUG nova.objects.instance [None req-551fe554-a819-4922-8a48-0bb34cb4e15b 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lazy-loading 'migration_context' on Instance uuid 1086a30e-9d4f-4278-8d00-f4ce489f3c08 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:39:33 compute-0 nova_compute[183278]: 2026-01-21 18:39:33.187 183284 DEBUG oslo_concurrency.processutils [None req-551fe554-a819-4922-8a48-0bb34cb4e15b 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/1086a30e-9d4f-4278-8d00-f4ce489f3c08/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:39:33 compute-0 nova_compute[183278]: 2026-01-21 18:39:33.211 183284 DEBUG oslo_concurrency.processutils [None req-551fe554-a819-4922-8a48-0bb34cb4e15b 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/1086a30e-9d4f-4278-8d00-f4ce489f3c08/disk.config 485376" returned: 0 in 0.024s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:39:33 compute-0 nova_compute[183278]: 2026-01-21 18:39:33.214 183284 DEBUG nova.virt.libvirt.volume.remotefs [None req-551fe554-a819-4922-8a48-0bb34cb4e15b 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Copying file compute-1.ctlplane.example.com:/var/lib/nova/instances/1086a30e-9d4f-4278-8d00-f4ce489f3c08/disk.config to /var/lib/nova/instances/1086a30e-9d4f-4278-8d00-f4ce489f3c08 copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 21 18:39:33 compute-0 nova_compute[183278]: 2026-01-21 18:39:33.215 183284 DEBUG oslo_concurrency.processutils [None req-551fe554-a819-4922-8a48-0bb34cb4e15b 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Running cmd (subprocess): scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/1086a30e-9d4f-4278-8d00-f4ce489f3c08/disk.config /var/lib/nova/instances/1086a30e-9d4f-4278-8d00-f4ce489f3c08 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:39:33 compute-0 nova_compute[183278]: 2026-01-21 18:39:33.345 183284 WARNING nova.virt.libvirt.driver [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 18:39:33 compute-0 nova_compute[183278]: 2026-01-21 18:39:33.347 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5692MB free_disk=73.3495979309082GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 18:39:33 compute-0 nova_compute[183278]: 2026-01-21 18:39:33.348 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:39:33 compute-0 nova_compute[183278]: 2026-01-21 18:39:33.349 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:39:33 compute-0 nova_compute[183278]: 2026-01-21 18:39:33.411 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Migration for instance 1086a30e-9d4f-4278-8d00-f4ce489f3c08 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Jan 21 18:39:33 compute-0 nova_compute[183278]: 2026-01-21 18:39:33.442 183284 INFO nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] [instance: 1086a30e-9d4f-4278-8d00-f4ce489f3c08] Updating resource usage from migration a5335559-d4fc-4c96-93f9-4a0dfe783611
Jan 21 18:39:33 compute-0 nova_compute[183278]: 2026-01-21 18:39:33.443 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] [instance: 1086a30e-9d4f-4278-8d00-f4ce489f3c08] Starting to track incoming migration a5335559-d4fc-4c96-93f9-4a0dfe783611 with flavor 45095fe9-3fd5-4f1f-87b2-a2a8292135a2 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431
Jan 21 18:39:33 compute-0 nova_compute[183278]: 2026-01-21 18:39:33.476 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Instance 20fd0a46-611d-4fcb-942c-11b291dfbaad actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 21 18:39:33 compute-0 nova_compute[183278]: 2026-01-21 18:39:33.494 183284 WARNING nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Instance 1086a30e-9d4f-4278-8d00-f4ce489f3c08 has been moved to another host compute-1.ctlplane.example.com(compute-1.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
Jan 21 18:39:33 compute-0 nova_compute[183278]: 2026-01-21 18:39:33.495 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 18:39:33 compute-0 nova_compute[183278]: 2026-01-21 18:39:33.495 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 18:39:33 compute-0 nova_compute[183278]: 2026-01-21 18:39:33.561 183284 DEBUG nova.compute.provider_tree [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Inventory has not changed in ProviderTree for provider: 502e4243-611b-433d-a766-9b485d51652d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 18:39:33 compute-0 nova_compute[183278]: 2026-01-21 18:39:33.575 183284 DEBUG nova.scheduler.client.report [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Inventory has not changed for provider 502e4243-611b-433d-a766-9b485d51652d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 18:39:33 compute-0 nova_compute[183278]: 2026-01-21 18:39:33.592 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 18:39:33 compute-0 nova_compute[183278]: 2026-01-21 18:39:33.592 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.243s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:39:33 compute-0 nova_compute[183278]: 2026-01-21 18:39:33.639 183284 DEBUG oslo_concurrency.processutils [None req-551fe554-a819-4922-8a48-0bb34cb4e15b 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] CMD "scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/1086a30e-9d4f-4278-8d00-f4ce489f3c08/disk.config /var/lib/nova/instances/1086a30e-9d4f-4278-8d00-f4ce489f3c08" returned: 0 in 0.424s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:39:33 compute-0 nova_compute[183278]: 2026-01-21 18:39:33.641 183284 DEBUG nova.virt.libvirt.driver [None req-551fe554-a819-4922-8a48-0bb34cb4e15b 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1086a30e-9d4f-4278-8d00-f4ce489f3c08] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Jan 21 18:39:33 compute-0 nova_compute[183278]: 2026-01-21 18:39:33.643 183284 DEBUG nova.virt.libvirt.vif [None req-551fe554-a819-4922-8a48-0bb34cb4e15b 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T18:38:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-742717832',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-742717832',id=26,image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T18:38:26Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='55c360f5fa1e445396df9c0ba67fb46d',ramdisk_id='',reservation_id='r-etorcggu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-145388266',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-145388266-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T18:38:26Z,user_data=None,user_id='61bd089d194f4cf380e1a5f0c92c9c62',uuid=1086a30e-9d4f-4278-8d00-f4ce489f3c08,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "06b8ba3f-960c-4966-9687-d40021122824", "address": "fa:16:3e:76:4b:aa", "network": {"id": "bdeafc54-899b-4c66-8810-1da93d23a91b", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-864084064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "55c360f5fa1e445396df9c0ba67fb46d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap06b8ba3f-96", "ovs_interfaceid": "06b8ba3f-960c-4966-9687-d40021122824", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 21 18:39:33 compute-0 nova_compute[183278]: 2026-01-21 18:39:33.644 183284 DEBUG nova.network.os_vif_util [None req-551fe554-a819-4922-8a48-0bb34cb4e15b 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Converting VIF {"id": "06b8ba3f-960c-4966-9687-d40021122824", "address": "fa:16:3e:76:4b:aa", "network": {"id": "bdeafc54-899b-4c66-8810-1da93d23a91b", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-864084064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "55c360f5fa1e445396df9c0ba67fb46d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap06b8ba3f-96", "ovs_interfaceid": "06b8ba3f-960c-4966-9687-d40021122824", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 18:39:33 compute-0 nova_compute[183278]: 2026-01-21 18:39:33.646 183284 DEBUG nova.network.os_vif_util [None req-551fe554-a819-4922-8a48-0bb34cb4e15b 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:76:4b:aa,bridge_name='br-int',has_traffic_filtering=True,id=06b8ba3f-960c-4966-9687-d40021122824,network=Network(bdeafc54-899b-4c66-8810-1da93d23a91b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06b8ba3f-96') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 18:39:33 compute-0 nova_compute[183278]: 2026-01-21 18:39:33.648 183284 DEBUG os_vif [None req-551fe554-a819-4922-8a48-0bb34cb4e15b 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:76:4b:aa,bridge_name='br-int',has_traffic_filtering=True,id=06b8ba3f-960c-4966-9687-d40021122824,network=Network(bdeafc54-899b-4c66-8810-1da93d23a91b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06b8ba3f-96') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 21 18:39:33 compute-0 nova_compute[183278]: 2026-01-21 18:39:33.649 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:39:33 compute-0 nova_compute[183278]: 2026-01-21 18:39:33.650 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:39:33 compute-0 nova_compute[183278]: 2026-01-21 18:39:33.650 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 18:39:33 compute-0 nova_compute[183278]: 2026-01-21 18:39:33.652 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:39:33 compute-0 nova_compute[183278]: 2026-01-21 18:39:33.653 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap06b8ba3f-96, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:39:33 compute-0 nova_compute[183278]: 2026-01-21 18:39:33.653 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap06b8ba3f-96, col_values=(('external_ids', {'iface-id': '06b8ba3f-960c-4966-9687-d40021122824', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:76:4b:aa', 'vm-uuid': '1086a30e-9d4f-4278-8d00-f4ce489f3c08'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:39:33 compute-0 nova_compute[183278]: 2026-01-21 18:39:33.655 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:39:33 compute-0 NetworkManager[55506]: <info>  [1769020773.6566] manager: (tap06b8ba3f-96): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/81)
Jan 21 18:39:33 compute-0 nova_compute[183278]: 2026-01-21 18:39:33.660 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 18:39:33 compute-0 nova_compute[183278]: 2026-01-21 18:39:33.662 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:39:33 compute-0 nova_compute[183278]: 2026-01-21 18:39:33.664 183284 INFO os_vif [None req-551fe554-a819-4922-8a48-0bb34cb4e15b 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:76:4b:aa,bridge_name='br-int',has_traffic_filtering=True,id=06b8ba3f-960c-4966-9687-d40021122824,network=Network(bdeafc54-899b-4c66-8810-1da93d23a91b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06b8ba3f-96')
Jan 21 18:39:33 compute-0 nova_compute[183278]: 2026-01-21 18:39:33.664 183284 DEBUG nova.virt.libvirt.driver [None req-551fe554-a819-4922-8a48-0bb34cb4e15b 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Jan 21 18:39:33 compute-0 nova_compute[183278]: 2026-01-21 18:39:33.664 183284 DEBUG nova.compute.manager [None req-551fe554-a819-4922-8a48-0bb34cb4e15b 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpr5xws3fi',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='1086a30e-9d4f-4278-8d00-f4ce489f3c08',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Jan 21 18:39:34 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:39:34.237 104698 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=25, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e2:aa:66', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f6:39:87:73:c8'}, ipsec=False) old=SB_Global(nb_cfg=24) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 18:39:34 compute-0 nova_compute[183278]: 2026-01-21 18:39:34.238 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:39:34 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:39:34.238 104698 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 21 18:39:34 compute-0 nova_compute[183278]: 2026-01-21 18:39:34.514 183284 DEBUG nova.network.neutron [None req-551fe554-a819-4922-8a48-0bb34cb4e15b 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1086a30e-9d4f-4278-8d00-f4ce489f3c08] Port 06b8ba3f-960c-4966-9687-d40021122824 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Jan 21 18:39:34 compute-0 nova_compute[183278]: 2026-01-21 18:39:34.516 183284 DEBUG nova.compute.manager [None req-551fe554-a819-4922-8a48-0bb34cb4e15b 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpr5xws3fi',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='1086a30e-9d4f-4278-8d00-f4ce489f3c08',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Jan 21 18:39:34 compute-0 nova_compute[183278]: 2026-01-21 18:39:34.591 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:39:34 compute-0 systemd[1]: Starting libvirt proxy daemon...
Jan 21 18:39:34 compute-0 systemd[1]: Started libvirt proxy daemon.
Jan 21 18:39:34 compute-0 kernel: tap06b8ba3f-96: entered promiscuous mode
Jan 21 18:39:34 compute-0 NetworkManager[55506]: <info>  [1769020774.8158] manager: (tap06b8ba3f-96): new Tun device (/org/freedesktop/NetworkManager/Devices/82)
Jan 21 18:39:34 compute-0 ovn_controller[95419]: 2026-01-21T18:39:34Z|00183|binding|INFO|Claiming lport 06b8ba3f-960c-4966-9687-d40021122824 for this additional chassis.
Jan 21 18:39:34 compute-0 ovn_controller[95419]: 2026-01-21T18:39:34Z|00184|binding|INFO|06b8ba3f-960c-4966-9687-d40021122824: Claiming fa:16:3e:76:4b:aa 10.100.0.6
Jan 21 18:39:34 compute-0 nova_compute[183278]: 2026-01-21 18:39:34.817 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:39:34 compute-0 nova_compute[183278]: 2026-01-21 18:39:34.832 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:39:34 compute-0 nova_compute[183278]: 2026-01-21 18:39:34.835 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:39:34 compute-0 ovn_controller[95419]: 2026-01-21T18:39:34Z|00185|binding|INFO|Setting lport 06b8ba3f-960c-4966-9687-d40021122824 ovn-installed in OVS
Jan 21 18:39:34 compute-0 nova_compute[183278]: 2026-01-21 18:39:34.837 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:39:34 compute-0 systemd-udevd[212540]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 18:39:34 compute-0 systemd-machined[154592]: New machine qemu-18-instance-0000001a.
Jan 21 18:39:34 compute-0 NetworkManager[55506]: <info>  [1769020774.8619] device (tap06b8ba3f-96): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 18:39:34 compute-0 NetworkManager[55506]: <info>  [1769020774.8629] device (tap06b8ba3f-96): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 18:39:34 compute-0 systemd[1]: Started Virtual Machine qemu-18-instance-0000001a.
Jan 21 18:39:35 compute-0 nova_compute[183278]: 2026-01-21 18:39:35.185 183284 DEBUG nova.virt.driver [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Emitting event <LifecycleEvent: 1769020775.1847975, 1086a30e-9d4f-4278-8d00-f4ce489f3c08 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:39:35 compute-0 nova_compute[183278]: 2026-01-21 18:39:35.185 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 1086a30e-9d4f-4278-8d00-f4ce489f3c08] VM Started (Lifecycle Event)
Jan 21 18:39:35 compute-0 nova_compute[183278]: 2026-01-21 18:39:35.214 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 1086a30e-9d4f-4278-8d00-f4ce489f3c08] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:39:35 compute-0 nova_compute[183278]: 2026-01-21 18:39:35.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:39:36 compute-0 nova_compute[183278]: 2026-01-21 18:39:36.229 183284 DEBUG nova.virt.driver [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Emitting event <LifecycleEvent: 1769020776.2287674, 1086a30e-9d4f-4278-8d00-f4ce489f3c08 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:39:36 compute-0 nova_compute[183278]: 2026-01-21 18:39:36.229 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 1086a30e-9d4f-4278-8d00-f4ce489f3c08] VM Resumed (Lifecycle Event)
Jan 21 18:39:36 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:39:36.239 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a4db021d-a451-4e5f-8011-49af760bda68, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '25'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:39:36 compute-0 nova_compute[183278]: 2026-01-21 18:39:36.258 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 1086a30e-9d4f-4278-8d00-f4ce489f3c08] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:39:36 compute-0 nova_compute[183278]: 2026-01-21 18:39:36.263 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 1086a30e-9d4f-4278-8d00-f4ce489f3c08] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 18:39:36 compute-0 nova_compute[183278]: 2026-01-21 18:39:36.278 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:39:36 compute-0 nova_compute[183278]: 2026-01-21 18:39:36.289 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: 1086a30e-9d4f-4278-8d00-f4ce489f3c08] During the sync_power process the instance has moved from host compute-1.ctlplane.example.com to host compute-0.ctlplane.example.com
Jan 21 18:39:37 compute-0 ovn_controller[95419]: 2026-01-21T18:39:37Z|00186|binding|INFO|Claiming lport 06b8ba3f-960c-4966-9687-d40021122824 for this chassis.
Jan 21 18:39:37 compute-0 ovn_controller[95419]: 2026-01-21T18:39:37Z|00187|binding|INFO|06b8ba3f-960c-4966-9687-d40021122824: Claiming fa:16:3e:76:4b:aa 10.100.0.6
Jan 21 18:39:37 compute-0 ovn_controller[95419]: 2026-01-21T18:39:37Z|00188|binding|INFO|Setting lport 06b8ba3f-960c-4966-9687-d40021122824 up in Southbound
Jan 21 18:39:37 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:39:37.471 104698 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:76:4b:aa 10.100.0.6'], port_security=['fa:16:3e:76:4b:aa 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '1086a30e-9d4f-4278-8d00-f4ce489f3c08', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bdeafc54-899b-4c66-8810-1da93d23a91b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '55c360f5fa1e445396df9c0ba67fb46d', 'neutron:revision_number': '11', 'neutron:security_group_ids': '2123a8b9-e62b-4182-97a1-e14b1b826b02', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dcc99376-1fa1-4201-9a47-c665a9504862, chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>], logical_port=06b8ba3f-960c-4966-9687-d40021122824) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 18:39:37 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:39:37.472 104698 INFO neutron.agent.ovn.metadata.agent [-] Port 06b8ba3f-960c-4966-9687-d40021122824 in datapath bdeafc54-899b-4c66-8810-1da93d23a91b bound to our chassis
Jan 21 18:39:37 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:39:37.473 104698 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network bdeafc54-899b-4c66-8810-1da93d23a91b
Jan 21 18:39:37 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:39:37.487 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[0c594568-f2a1-42d1-ac86-2ba4e0d91d7e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:39:37 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:39:37.517 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[866f6956-965a-4468-86d7-dbe49ac59d2b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:39:37 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:39:37.519 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[98c4defa-ffe4-489f-a83d-184cb5b7630b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:39:37 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:39:37.547 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[983ceb85-1ed5-48d6-84d0-1fd0ddffa4d2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:39:37 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:39:37.563 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[67ab51b1-548a-423a-a5ee-556827c71fd3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbdeafc54-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c4:ad:73'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 55], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 525086, 'reachable_time': 34558, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212574, 'error': None, 'target': 'ovnmeta-bdeafc54-899b-4c66-8810-1da93d23a91b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:39:37 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:39:37.579 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[040abff4-acad-4e97-a911-2f740c991390]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapbdeafc54-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 525098, 'tstamp': 525098}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212575, 'error': None, 'target': 'ovnmeta-bdeafc54-899b-4c66-8810-1da93d23a91b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapbdeafc54-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 525101, 'tstamp': 525101}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212575, 'error': None, 'target': 'ovnmeta-bdeafc54-899b-4c66-8810-1da93d23a91b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:39:37 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:39:37.581 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbdeafc54-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:39:37 compute-0 nova_compute[183278]: 2026-01-21 18:39:37.582 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:39:37 compute-0 nova_compute[183278]: 2026-01-21 18:39:37.583 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:39:37 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:39:37.583 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbdeafc54-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:39:37 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:39:37.583 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 18:39:37 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:39:37.583 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapbdeafc54-80, col_values=(('external_ids', {'iface-id': '852717d2-9d24-4348-bdea-ddbd3580930b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:39:37 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:39:37.584 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 18:39:37 compute-0 nova_compute[183278]: 2026-01-21 18:39:37.627 183284 INFO nova.compute.manager [None req-551fe554-a819-4922-8a48-0bb34cb4e15b 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1086a30e-9d4f-4278-8d00-f4ce489f3c08] Post operation of migration started
Jan 21 18:39:37 compute-0 nova_compute[183278]: 2026-01-21 18:39:37.951 183284 DEBUG oslo_concurrency.lockutils [None req-551fe554-a819-4922-8a48-0bb34cb4e15b 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "refresh_cache-1086a30e-9d4f-4278-8d00-f4ce489f3c08" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:39:37 compute-0 nova_compute[183278]: 2026-01-21 18:39:37.952 183284 DEBUG oslo_concurrency.lockutils [None req-551fe554-a819-4922-8a48-0bb34cb4e15b 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Acquired lock "refresh_cache-1086a30e-9d4f-4278-8d00-f4ce489f3c08" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 18:39:37 compute-0 nova_compute[183278]: 2026-01-21 18:39:37.952 183284 DEBUG nova.network.neutron [None req-551fe554-a819-4922-8a48-0bb34cb4e15b 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1086a30e-9d4f-4278-8d00-f4ce489f3c08] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 18:39:38 compute-0 nova_compute[183278]: 2026-01-21 18:39:38.656 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:39:39 compute-0 nova_compute[183278]: 2026-01-21 18:39:39.812 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:39:39 compute-0 nova_compute[183278]: 2026-01-21 18:39:39.848 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:39:40 compute-0 nova_compute[183278]: 2026-01-21 18:39:40.049 183284 DEBUG nova.network.neutron [None req-551fe554-a819-4922-8a48-0bb34cb4e15b 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1086a30e-9d4f-4278-8d00-f4ce489f3c08] Updating instance_info_cache with network_info: [{"id": "06b8ba3f-960c-4966-9687-d40021122824", "address": "fa:16:3e:76:4b:aa", "network": {"id": "bdeafc54-899b-4c66-8810-1da93d23a91b", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-864084064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "55c360f5fa1e445396df9c0ba67fb46d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06b8ba3f-96", "ovs_interfaceid": "06b8ba3f-960c-4966-9687-d40021122824", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:39:40 compute-0 nova_compute[183278]: 2026-01-21 18:39:40.069 183284 DEBUG oslo_concurrency.lockutils [None req-551fe554-a819-4922-8a48-0bb34cb4e15b 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Releasing lock "refresh_cache-1086a30e-9d4f-4278-8d00-f4ce489f3c08" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 18:39:40 compute-0 nova_compute[183278]: 2026-01-21 18:39:40.086 183284 DEBUG oslo_concurrency.lockutils [None req-551fe554-a819-4922-8a48-0bb34cb4e15b 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:39:40 compute-0 nova_compute[183278]: 2026-01-21 18:39:40.087 183284 DEBUG oslo_concurrency.lockutils [None req-551fe554-a819-4922-8a48-0bb34cb4e15b 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:39:40 compute-0 nova_compute[183278]: 2026-01-21 18:39:40.087 183284 DEBUG oslo_concurrency.lockutils [None req-551fe554-a819-4922-8a48-0bb34cb4e15b 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:39:40 compute-0 nova_compute[183278]: 2026-01-21 18:39:40.090 183284 INFO nova.virt.libvirt.driver [None req-551fe554-a819-4922-8a48-0bb34cb4e15b 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1086a30e-9d4f-4278-8d00-f4ce489f3c08] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Jan 21 18:39:40 compute-0 virtqemud[182681]: Domain id=18 name='instance-0000001a' uuid=1086a30e-9d4f-4278-8d00-f4ce489f3c08 is tainted: custom-monitor
Jan 21 18:39:41 compute-0 nova_compute[183278]: 2026-01-21 18:39:41.098 183284 INFO nova.virt.libvirt.driver [None req-551fe554-a819-4922-8a48-0bb34cb4e15b 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1086a30e-9d4f-4278-8d00-f4ce489f3c08] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Jan 21 18:39:41 compute-0 nova_compute[183278]: 2026-01-21 18:39:41.279 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:39:41 compute-0 nova_compute[183278]: 2026-01-21 18:39:41.818 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:39:41 compute-0 nova_compute[183278]: 2026-01-21 18:39:41.818 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 18:39:42 compute-0 nova_compute[183278]: 2026-01-21 18:39:42.104 183284 INFO nova.virt.libvirt.driver [None req-551fe554-a819-4922-8a48-0bb34cb4e15b 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1086a30e-9d4f-4278-8d00-f4ce489f3c08] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Jan 21 18:39:42 compute-0 nova_compute[183278]: 2026-01-21 18:39:42.109 183284 DEBUG nova.compute.manager [None req-551fe554-a819-4922-8a48-0bb34cb4e15b 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1086a30e-9d4f-4278-8d00-f4ce489f3c08] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:39:42 compute-0 nova_compute[183278]: 2026-01-21 18:39:42.127 183284 DEBUG nova.objects.instance [None req-551fe554-a819-4922-8a48-0bb34cb4e15b 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1086a30e-9d4f-4278-8d00-f4ce489f3c08] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Jan 21 18:39:43 compute-0 nova_compute[183278]: 2026-01-21 18:39:43.658 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:39:46 compute-0 nova_compute[183278]: 2026-01-21 18:39:46.282 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:39:47 compute-0 nova_compute[183278]: 2026-01-21 18:39:47.058 183284 DEBUG oslo_concurrency.lockutils [None req-a4d72528-13f1-41b7-8289-22b1378614ed 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Acquiring lock "1086a30e-9d4f-4278-8d00-f4ce489f3c08" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:39:47 compute-0 nova_compute[183278]: 2026-01-21 18:39:47.059 183284 DEBUG oslo_concurrency.lockutils [None req-a4d72528-13f1-41b7-8289-22b1378614ed 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Lock "1086a30e-9d4f-4278-8d00-f4ce489f3c08" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:39:47 compute-0 nova_compute[183278]: 2026-01-21 18:39:47.059 183284 DEBUG oslo_concurrency.lockutils [None req-a4d72528-13f1-41b7-8289-22b1378614ed 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Acquiring lock "1086a30e-9d4f-4278-8d00-f4ce489f3c08-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:39:47 compute-0 nova_compute[183278]: 2026-01-21 18:39:47.059 183284 DEBUG oslo_concurrency.lockutils [None req-a4d72528-13f1-41b7-8289-22b1378614ed 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Lock "1086a30e-9d4f-4278-8d00-f4ce489f3c08-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:39:47 compute-0 nova_compute[183278]: 2026-01-21 18:39:47.060 183284 DEBUG oslo_concurrency.lockutils [None req-a4d72528-13f1-41b7-8289-22b1378614ed 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Lock "1086a30e-9d4f-4278-8d00-f4ce489f3c08-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:39:47 compute-0 nova_compute[183278]: 2026-01-21 18:39:47.061 183284 INFO nova.compute.manager [None req-a4d72528-13f1-41b7-8289-22b1378614ed 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] [instance: 1086a30e-9d4f-4278-8d00-f4ce489f3c08] Terminating instance
Jan 21 18:39:47 compute-0 nova_compute[183278]: 2026-01-21 18:39:47.061 183284 DEBUG nova.compute.manager [None req-a4d72528-13f1-41b7-8289-22b1378614ed 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] [instance: 1086a30e-9d4f-4278-8d00-f4ce489f3c08] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 21 18:39:47 compute-0 kernel: tap06b8ba3f-96 (unregistering): left promiscuous mode
Jan 21 18:39:47 compute-0 NetworkManager[55506]: <info>  [1769020787.0849] device (tap06b8ba3f-96): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 18:39:47 compute-0 ovn_controller[95419]: 2026-01-21T18:39:47Z|00189|binding|INFO|Releasing lport 06b8ba3f-960c-4966-9687-d40021122824 from this chassis (sb_readonly=0)
Jan 21 18:39:47 compute-0 ovn_controller[95419]: 2026-01-21T18:39:47Z|00190|binding|INFO|Setting lport 06b8ba3f-960c-4966-9687-d40021122824 down in Southbound
Jan 21 18:39:47 compute-0 nova_compute[183278]: 2026-01-21 18:39:47.093 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:39:47 compute-0 ovn_controller[95419]: 2026-01-21T18:39:47Z|00191|binding|INFO|Removing iface tap06b8ba3f-96 ovn-installed in OVS
Jan 21 18:39:47 compute-0 nova_compute[183278]: 2026-01-21 18:39:47.095 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:39:47 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:39:47.100 104698 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:76:4b:aa 10.100.0.6'], port_security=['fa:16:3e:76:4b:aa 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '1086a30e-9d4f-4278-8d00-f4ce489f3c08', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bdeafc54-899b-4c66-8810-1da93d23a91b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '55c360f5fa1e445396df9c0ba67fb46d', 'neutron:revision_number': '13', 'neutron:security_group_ids': '2123a8b9-e62b-4182-97a1-e14b1b826b02', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dcc99376-1fa1-4201-9a47-c665a9504862, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>], logical_port=06b8ba3f-960c-4966-9687-d40021122824) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 18:39:47 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:39:47.101 104698 INFO neutron.agent.ovn.metadata.agent [-] Port 06b8ba3f-960c-4966-9687-d40021122824 in datapath bdeafc54-899b-4c66-8810-1da93d23a91b unbound from our chassis
Jan 21 18:39:47 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:39:47.102 104698 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network bdeafc54-899b-4c66-8810-1da93d23a91b
Jan 21 18:39:47 compute-0 nova_compute[183278]: 2026-01-21 18:39:47.106 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:39:47 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:39:47.119 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[9cb3e967-f3b9-41a1-886c-d7c58f7b688f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:39:47 compute-0 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d0000001a.scope: Deactivated successfully.
Jan 21 18:39:47 compute-0 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d0000001a.scope: Consumed 1.228s CPU time.
Jan 21 18:39:47 compute-0 systemd-machined[154592]: Machine qemu-18-instance-0000001a terminated.
Jan 21 18:39:47 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:39:47.150 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[21f43967-a821-4256-91a6-187ab9bd9738]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:39:47 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:39:47.153 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[cef90f5d-f26b-4262-a77d-44f2c96b3973]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:39:47 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:39:47.179 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[78982c00-bca8-4ed4-b64e-7faa714cd1b9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:39:47 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:39:47.195 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[5cf2531b-8d52-4307-80e7-d82a3b1dbb8c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbdeafc54-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c4:ad:73'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 55], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 525086, 'reachable_time': 34558, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212589, 'error': None, 'target': 'ovnmeta-bdeafc54-899b-4c66-8810-1da93d23a91b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:39:47 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:39:47.211 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[e3e90108-9082-4443-8b5d-b370483ed080]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapbdeafc54-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 525098, 'tstamp': 525098}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212590, 'error': None, 'target': 'ovnmeta-bdeafc54-899b-4c66-8810-1da93d23a91b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapbdeafc54-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 525101, 'tstamp': 525101}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212590, 'error': None, 'target': 'ovnmeta-bdeafc54-899b-4c66-8810-1da93d23a91b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:39:47 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:39:47.213 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbdeafc54-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:39:47 compute-0 nova_compute[183278]: 2026-01-21 18:39:47.215 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:39:47 compute-0 nova_compute[183278]: 2026-01-21 18:39:47.221 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:39:47 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:39:47.221 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbdeafc54-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:39:47 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:39:47.222 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 18:39:47 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:39:47.222 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapbdeafc54-80, col_values=(('external_ids', {'iface-id': '852717d2-9d24-4348-bdea-ddbd3580930b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:39:47 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:39:47.222 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 18:39:47 compute-0 nova_compute[183278]: 2026-01-21 18:39:47.278 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:39:47 compute-0 nova_compute[183278]: 2026-01-21 18:39:47.282 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:39:47 compute-0 nova_compute[183278]: 2026-01-21 18:39:47.319 183284 INFO nova.virt.libvirt.driver [-] [instance: 1086a30e-9d4f-4278-8d00-f4ce489f3c08] Instance destroyed successfully.
Jan 21 18:39:47 compute-0 nova_compute[183278]: 2026-01-21 18:39:47.319 183284 DEBUG nova.objects.instance [None req-a4d72528-13f1-41b7-8289-22b1378614ed 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Lazy-loading 'resources' on Instance uuid 1086a30e-9d4f-4278-8d00-f4ce489f3c08 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:39:47 compute-0 nova_compute[183278]: 2026-01-21 18:39:47.335 183284 DEBUG nova.virt.libvirt.vif [None req-a4d72528-13f1-41b7-8289-22b1378614ed 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-21T18:38:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-742717832',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-742717832',id=26,image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T18:38:26Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='55c360f5fa1e445396df9c0ba67fb46d',ramdisk_id='',reservation_id='r-etorcggu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-145388266',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-145388266-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T18:39:42Z,user_data=None,user_id='61bd089d194f4cf380e1a5f0c92c9c62',uuid=1086a30e-9d4f-4278-8d00-f4ce489f3c08,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "06b8ba3f-960c-4966-9687-d40021122824", "address": "fa:16:3e:76:4b:aa", "network": {"id": "bdeafc54-899b-4c66-8810-1da93d23a91b", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-864084064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "55c360f5fa1e445396df9c0ba67fb46d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06b8ba3f-96", "ovs_interfaceid": "06b8ba3f-960c-4966-9687-d40021122824", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 18:39:47 compute-0 nova_compute[183278]: 2026-01-21 18:39:47.336 183284 DEBUG nova.network.os_vif_util [None req-a4d72528-13f1-41b7-8289-22b1378614ed 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Converting VIF {"id": "06b8ba3f-960c-4966-9687-d40021122824", "address": "fa:16:3e:76:4b:aa", "network": {"id": "bdeafc54-899b-4c66-8810-1da93d23a91b", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-864084064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "55c360f5fa1e445396df9c0ba67fb46d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06b8ba3f-96", "ovs_interfaceid": "06b8ba3f-960c-4966-9687-d40021122824", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 18:39:47 compute-0 nova_compute[183278]: 2026-01-21 18:39:47.336 183284 DEBUG nova.network.os_vif_util [None req-a4d72528-13f1-41b7-8289-22b1378614ed 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:76:4b:aa,bridge_name='br-int',has_traffic_filtering=True,id=06b8ba3f-960c-4966-9687-d40021122824,network=Network(bdeafc54-899b-4c66-8810-1da93d23a91b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06b8ba3f-96') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 18:39:47 compute-0 nova_compute[183278]: 2026-01-21 18:39:47.337 183284 DEBUG os_vif [None req-a4d72528-13f1-41b7-8289-22b1378614ed 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:76:4b:aa,bridge_name='br-int',has_traffic_filtering=True,id=06b8ba3f-960c-4966-9687-d40021122824,network=Network(bdeafc54-899b-4c66-8810-1da93d23a91b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06b8ba3f-96') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 21 18:39:47 compute-0 nova_compute[183278]: 2026-01-21 18:39:47.338 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:39:47 compute-0 nova_compute[183278]: 2026-01-21 18:39:47.339 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap06b8ba3f-96, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:39:47 compute-0 nova_compute[183278]: 2026-01-21 18:39:47.340 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:39:47 compute-0 nova_compute[183278]: 2026-01-21 18:39:47.343 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 18:39:47 compute-0 nova_compute[183278]: 2026-01-21 18:39:47.345 183284 INFO os_vif [None req-a4d72528-13f1-41b7-8289-22b1378614ed 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:76:4b:aa,bridge_name='br-int',has_traffic_filtering=True,id=06b8ba3f-960c-4966-9687-d40021122824,network=Network(bdeafc54-899b-4c66-8810-1da93d23a91b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06b8ba3f-96')
Jan 21 18:39:47 compute-0 nova_compute[183278]: 2026-01-21 18:39:47.346 183284 INFO nova.virt.libvirt.driver [None req-a4d72528-13f1-41b7-8289-22b1378614ed 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] [instance: 1086a30e-9d4f-4278-8d00-f4ce489f3c08] Deleting instance files /var/lib/nova/instances/1086a30e-9d4f-4278-8d00-f4ce489f3c08_del
Jan 21 18:39:47 compute-0 nova_compute[183278]: 2026-01-21 18:39:47.346 183284 INFO nova.virt.libvirt.driver [None req-a4d72528-13f1-41b7-8289-22b1378614ed 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] [instance: 1086a30e-9d4f-4278-8d00-f4ce489f3c08] Deletion of /var/lib/nova/instances/1086a30e-9d4f-4278-8d00-f4ce489f3c08_del complete
Jan 21 18:39:47 compute-0 nova_compute[183278]: 2026-01-21 18:39:47.405 183284 INFO nova.compute.manager [None req-a4d72528-13f1-41b7-8289-22b1378614ed 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] [instance: 1086a30e-9d4f-4278-8d00-f4ce489f3c08] Took 0.34 seconds to destroy the instance on the hypervisor.
Jan 21 18:39:47 compute-0 nova_compute[183278]: 2026-01-21 18:39:47.406 183284 DEBUG oslo.service.loopingcall [None req-a4d72528-13f1-41b7-8289-22b1378614ed 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 21 18:39:47 compute-0 nova_compute[183278]: 2026-01-21 18:39:47.406 183284 DEBUG nova.compute.manager [-] [instance: 1086a30e-9d4f-4278-8d00-f4ce489f3c08] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 21 18:39:47 compute-0 nova_compute[183278]: 2026-01-21 18:39:47.407 183284 DEBUG nova.network.neutron [-] [instance: 1086a30e-9d4f-4278-8d00-f4ce489f3c08] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 21 18:39:47 compute-0 nova_compute[183278]: 2026-01-21 18:39:47.557 183284 DEBUG nova.compute.manager [req-badaa9cd-8108-429f-9009-f867a2c3776d req-fea8af55-fb54-4a5d-85f9-4fbc4db54388 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1086a30e-9d4f-4278-8d00-f4ce489f3c08] Received event network-vif-unplugged-06b8ba3f-960c-4966-9687-d40021122824 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:39:47 compute-0 nova_compute[183278]: 2026-01-21 18:39:47.558 183284 DEBUG oslo_concurrency.lockutils [req-badaa9cd-8108-429f-9009-f867a2c3776d req-fea8af55-fb54-4a5d-85f9-4fbc4db54388 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "1086a30e-9d4f-4278-8d00-f4ce489f3c08-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:39:47 compute-0 nova_compute[183278]: 2026-01-21 18:39:47.558 183284 DEBUG oslo_concurrency.lockutils [req-badaa9cd-8108-429f-9009-f867a2c3776d req-fea8af55-fb54-4a5d-85f9-4fbc4db54388 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "1086a30e-9d4f-4278-8d00-f4ce489f3c08-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:39:47 compute-0 nova_compute[183278]: 2026-01-21 18:39:47.558 183284 DEBUG oslo_concurrency.lockutils [req-badaa9cd-8108-429f-9009-f867a2c3776d req-fea8af55-fb54-4a5d-85f9-4fbc4db54388 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "1086a30e-9d4f-4278-8d00-f4ce489f3c08-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:39:47 compute-0 nova_compute[183278]: 2026-01-21 18:39:47.559 183284 DEBUG nova.compute.manager [req-badaa9cd-8108-429f-9009-f867a2c3776d req-fea8af55-fb54-4a5d-85f9-4fbc4db54388 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1086a30e-9d4f-4278-8d00-f4ce489f3c08] No waiting events found dispatching network-vif-unplugged-06b8ba3f-960c-4966-9687-d40021122824 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:39:47 compute-0 nova_compute[183278]: 2026-01-21 18:39:47.559 183284 DEBUG nova.compute.manager [req-badaa9cd-8108-429f-9009-f867a2c3776d req-fea8af55-fb54-4a5d-85f9-4fbc4db54388 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1086a30e-9d4f-4278-8d00-f4ce489f3c08] Received event network-vif-unplugged-06b8ba3f-960c-4966-9687-d40021122824 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 21 18:39:47 compute-0 nova_compute[183278]: 2026-01-21 18:39:47.945 183284 DEBUG nova.network.neutron [-] [instance: 1086a30e-9d4f-4278-8d00-f4ce489f3c08] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:39:47 compute-0 nova_compute[183278]: 2026-01-21 18:39:47.963 183284 INFO nova.compute.manager [-] [instance: 1086a30e-9d4f-4278-8d00-f4ce489f3c08] Took 0.56 seconds to deallocate network for instance.
Jan 21 18:39:48 compute-0 nova_compute[183278]: 2026-01-21 18:39:48.010 183284 DEBUG oslo_concurrency.lockutils [None req-a4d72528-13f1-41b7-8289-22b1378614ed 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:39:48 compute-0 nova_compute[183278]: 2026-01-21 18:39:48.010 183284 DEBUG oslo_concurrency.lockutils [None req-a4d72528-13f1-41b7-8289-22b1378614ed 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:39:48 compute-0 nova_compute[183278]: 2026-01-21 18:39:48.015 183284 DEBUG oslo_concurrency.lockutils [None req-a4d72528-13f1-41b7-8289-22b1378614ed 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.005s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:39:48 compute-0 nova_compute[183278]: 2026-01-21 18:39:48.022 183284 DEBUG nova.compute.manager [req-08de9145-6902-48a3-9383-f5947de1cd8b req-ee1201f6-54ab-414b-a498-2993ffedc3bd 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1086a30e-9d4f-4278-8d00-f4ce489f3c08] Received event network-vif-deleted-06b8ba3f-960c-4966-9687-d40021122824 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:39:48 compute-0 nova_compute[183278]: 2026-01-21 18:39:48.055 183284 INFO nova.scheduler.client.report [None req-a4d72528-13f1-41b7-8289-22b1378614ed 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Deleted allocations for instance 1086a30e-9d4f-4278-8d00-f4ce489f3c08
Jan 21 18:39:48 compute-0 nova_compute[183278]: 2026-01-21 18:39:48.124 183284 DEBUG oslo_concurrency.lockutils [None req-a4d72528-13f1-41b7-8289-22b1378614ed 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Lock "1086a30e-9d4f-4278-8d00-f4ce489f3c08" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.065s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:39:48 compute-0 nova_compute[183278]: 2026-01-21 18:39:48.887 183284 DEBUG oslo_concurrency.lockutils [None req-d84ea561-2caa-42d9-84a3-4f700228ccc7 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Acquiring lock "20fd0a46-611d-4fcb-942c-11b291dfbaad" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:39:48 compute-0 nova_compute[183278]: 2026-01-21 18:39:48.887 183284 DEBUG oslo_concurrency.lockutils [None req-d84ea561-2caa-42d9-84a3-4f700228ccc7 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Lock "20fd0a46-611d-4fcb-942c-11b291dfbaad" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:39:48 compute-0 nova_compute[183278]: 2026-01-21 18:39:48.888 183284 DEBUG oslo_concurrency.lockutils [None req-d84ea561-2caa-42d9-84a3-4f700228ccc7 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Acquiring lock "20fd0a46-611d-4fcb-942c-11b291dfbaad-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:39:48 compute-0 nova_compute[183278]: 2026-01-21 18:39:48.888 183284 DEBUG oslo_concurrency.lockutils [None req-d84ea561-2caa-42d9-84a3-4f700228ccc7 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Lock "20fd0a46-611d-4fcb-942c-11b291dfbaad-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:39:48 compute-0 nova_compute[183278]: 2026-01-21 18:39:48.888 183284 DEBUG oslo_concurrency.lockutils [None req-d84ea561-2caa-42d9-84a3-4f700228ccc7 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Lock "20fd0a46-611d-4fcb-942c-11b291dfbaad-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:39:48 compute-0 nova_compute[183278]: 2026-01-21 18:39:48.890 183284 INFO nova.compute.manager [None req-d84ea561-2caa-42d9-84a3-4f700228ccc7 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] [instance: 20fd0a46-611d-4fcb-942c-11b291dfbaad] Terminating instance
Jan 21 18:39:48 compute-0 nova_compute[183278]: 2026-01-21 18:39:48.891 183284 DEBUG nova.compute.manager [None req-d84ea561-2caa-42d9-84a3-4f700228ccc7 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] [instance: 20fd0a46-611d-4fcb-942c-11b291dfbaad] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 21 18:39:48 compute-0 kernel: tap40abab10-89 (unregistering): left promiscuous mode
Jan 21 18:39:48 compute-0 NetworkManager[55506]: <info>  [1769020788.9101] device (tap40abab10-89): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 18:39:48 compute-0 ovn_controller[95419]: 2026-01-21T18:39:48Z|00192|binding|INFO|Releasing lport 40abab10-895c-40a6-b87b-12fced1a5b22 from this chassis (sb_readonly=0)
Jan 21 18:39:48 compute-0 ovn_controller[95419]: 2026-01-21T18:39:48Z|00193|binding|INFO|Setting lport 40abab10-895c-40a6-b87b-12fced1a5b22 down in Southbound
Jan 21 18:39:48 compute-0 ovn_controller[95419]: 2026-01-21T18:39:48Z|00194|binding|INFO|Removing iface tap40abab10-89 ovn-installed in OVS
Jan 21 18:39:48 compute-0 nova_compute[183278]: 2026-01-21 18:39:48.913 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:39:48 compute-0 nova_compute[183278]: 2026-01-21 18:39:48.914 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:39:48 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:39:48.921 104698 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:66:ba:be 10.100.0.14'], port_security=['fa:16:3e:66:ba:be 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '20fd0a46-611d-4fcb-942c-11b291dfbaad', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bdeafc54-899b-4c66-8810-1da93d23a91b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '55c360f5fa1e445396df9c0ba67fb46d', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2123a8b9-e62b-4182-97a1-e14b1b826b02', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dcc99376-1fa1-4201-9a47-c665a9504862, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>], logical_port=40abab10-895c-40a6-b87b-12fced1a5b22) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 18:39:48 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:39:48.922 104698 INFO neutron.agent.ovn.metadata.agent [-] Port 40abab10-895c-40a6-b87b-12fced1a5b22 in datapath bdeafc54-899b-4c66-8810-1da93d23a91b unbound from our chassis
Jan 21 18:39:48 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:39:48.923 104698 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network bdeafc54-899b-4c66-8810-1da93d23a91b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 21 18:39:48 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:39:48.924 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[2433c97f-d6a7-40be-9316-6946f0274def]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:39:48 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:39:48.925 104698 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-bdeafc54-899b-4c66-8810-1da93d23a91b namespace which is not needed anymore
Jan 21 18:39:48 compute-0 nova_compute[183278]: 2026-01-21 18:39:48.932 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:39:48 compute-0 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000019.scope: Deactivated successfully.
Jan 21 18:39:48 compute-0 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000019.scope: Consumed 16.294s CPU time.
Jan 21 18:39:48 compute-0 systemd-machined[154592]: Machine qemu-17-instance-00000019 terminated.
Jan 21 18:39:49 compute-0 neutron-haproxy-ovnmeta-bdeafc54-899b-4c66-8810-1da93d23a91b[212144]: [NOTICE]   (212148) : haproxy version is 2.8.14-c23fe91
Jan 21 18:39:49 compute-0 neutron-haproxy-ovnmeta-bdeafc54-899b-4c66-8810-1da93d23a91b[212144]: [NOTICE]   (212148) : path to executable is /usr/sbin/haproxy
Jan 21 18:39:49 compute-0 neutron-haproxy-ovnmeta-bdeafc54-899b-4c66-8810-1da93d23a91b[212144]: [WARNING]  (212148) : Exiting Master process...
Jan 21 18:39:49 compute-0 neutron-haproxy-ovnmeta-bdeafc54-899b-4c66-8810-1da93d23a91b[212144]: [ALERT]    (212148) : Current worker (212150) exited with code 143 (Terminated)
Jan 21 18:39:49 compute-0 neutron-haproxy-ovnmeta-bdeafc54-899b-4c66-8810-1da93d23a91b[212144]: [WARNING]  (212148) : All workers exited. Exiting... (0)
Jan 21 18:39:49 compute-0 systemd[1]: libpod-9f6aa24469285926243a84968f9eeb1e2d5da0305e05d87e4a6735a97a6d4d28.scope: Deactivated successfully.
Jan 21 18:39:49 compute-0 podman[212632]: 2026-01-21 18:39:49.069853374 +0000 UTC m=+0.052068710 container died 9f6aa24469285926243a84968f9eeb1e2d5da0305e05d87e4a6735a97a6d4d28 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bdeafc54-899b-4c66-8810-1da93d23a91b, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 21 18:39:49 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9f6aa24469285926243a84968f9eeb1e2d5da0305e05d87e4a6735a97a6d4d28-userdata-shm.mount: Deactivated successfully.
Jan 21 18:39:49 compute-0 systemd[1]: var-lib-containers-storage-overlay-5d0b5a8b9751a0b473d94ed82df4fda028c4684a1f48258c8899ea2a681a1b8a-merged.mount: Deactivated successfully.
Jan 21 18:39:49 compute-0 podman[212632]: 2026-01-21 18:39:49.10937523 +0000 UTC m=+0.091590566 container cleanup 9f6aa24469285926243a84968f9eeb1e2d5da0305e05d87e4a6735a97a6d4d28 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bdeafc54-899b-4c66-8810-1da93d23a91b, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202)
Jan 21 18:39:49 compute-0 NetworkManager[55506]: <info>  [1769020789.1100] manager: (tap40abab10-89): new Tun device (/org/freedesktop/NetworkManager/Devices/83)
Jan 21 18:39:49 compute-0 nova_compute[183278]: 2026-01-21 18:39:49.111 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:39:49 compute-0 nova_compute[183278]: 2026-01-21 18:39:49.115 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:39:49 compute-0 systemd[1]: libpod-conmon-9f6aa24469285926243a84968f9eeb1e2d5da0305e05d87e4a6735a97a6d4d28.scope: Deactivated successfully.
Jan 21 18:39:49 compute-0 nova_compute[183278]: 2026-01-21 18:39:49.149 183284 INFO nova.virt.libvirt.driver [-] [instance: 20fd0a46-611d-4fcb-942c-11b291dfbaad] Instance destroyed successfully.
Jan 21 18:39:49 compute-0 nova_compute[183278]: 2026-01-21 18:39:49.150 183284 DEBUG nova.objects.instance [None req-d84ea561-2caa-42d9-84a3-4f700228ccc7 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Lazy-loading 'resources' on Instance uuid 20fd0a46-611d-4fcb-942c-11b291dfbaad obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:39:49 compute-0 nova_compute[183278]: 2026-01-21 18:39:49.162 183284 DEBUG nova.virt.libvirt.vif [None req-d84ea561-2caa-42d9-84a3-4f700228ccc7 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T18:37:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-1706326110',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-1706326110',id=25,image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T18:38:08Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='55c360f5fa1e445396df9c0ba67fb46d',ramdisk_id='',reservation_id='r-wtrayq0d',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-145388266',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-145388266-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T18:38:08Z,user_data=None,user_id='61bd089d194f4cf380e1a5f0c92c9c62',uuid=20fd0a46-611d-4fcb-942c-11b291dfbaad,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "40abab10-895c-40a6-b87b-12fced1a5b22", "address": "fa:16:3e:66:ba:be", "network": {"id": "bdeafc54-899b-4c66-8810-1da93d23a91b", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-864084064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "55c360f5fa1e445396df9c0ba67fb46d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap40abab10-89", "ovs_interfaceid": "40abab10-895c-40a6-b87b-12fced1a5b22", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 18:39:49 compute-0 nova_compute[183278]: 2026-01-21 18:39:49.163 183284 DEBUG nova.network.os_vif_util [None req-d84ea561-2caa-42d9-84a3-4f700228ccc7 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Converting VIF {"id": "40abab10-895c-40a6-b87b-12fced1a5b22", "address": "fa:16:3e:66:ba:be", "network": {"id": "bdeafc54-899b-4c66-8810-1da93d23a91b", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-864084064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "55c360f5fa1e445396df9c0ba67fb46d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap40abab10-89", "ovs_interfaceid": "40abab10-895c-40a6-b87b-12fced1a5b22", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 18:39:49 compute-0 nova_compute[183278]: 2026-01-21 18:39:49.164 183284 DEBUG nova.network.os_vif_util [None req-d84ea561-2caa-42d9-84a3-4f700228ccc7 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:66:ba:be,bridge_name='br-int',has_traffic_filtering=True,id=40abab10-895c-40a6-b87b-12fced1a5b22,network=Network(bdeafc54-899b-4c66-8810-1da93d23a91b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap40abab10-89') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 18:39:49 compute-0 nova_compute[183278]: 2026-01-21 18:39:49.164 183284 DEBUG os_vif [None req-d84ea561-2caa-42d9-84a3-4f700228ccc7 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:66:ba:be,bridge_name='br-int',has_traffic_filtering=True,id=40abab10-895c-40a6-b87b-12fced1a5b22,network=Network(bdeafc54-899b-4c66-8810-1da93d23a91b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap40abab10-89') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 21 18:39:49 compute-0 nova_compute[183278]: 2026-01-21 18:39:49.165 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:39:49 compute-0 nova_compute[183278]: 2026-01-21 18:39:49.166 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap40abab10-89, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:39:49 compute-0 nova_compute[183278]: 2026-01-21 18:39:49.168 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:39:49 compute-0 nova_compute[183278]: 2026-01-21 18:39:49.171 183284 INFO os_vif [None req-d84ea561-2caa-42d9-84a3-4f700228ccc7 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:66:ba:be,bridge_name='br-int',has_traffic_filtering=True,id=40abab10-895c-40a6-b87b-12fced1a5b22,network=Network(bdeafc54-899b-4c66-8810-1da93d23a91b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap40abab10-89')
Jan 21 18:39:49 compute-0 nova_compute[183278]: 2026-01-21 18:39:49.172 183284 INFO nova.virt.libvirt.driver [None req-d84ea561-2caa-42d9-84a3-4f700228ccc7 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] [instance: 20fd0a46-611d-4fcb-942c-11b291dfbaad] Deleting instance files /var/lib/nova/instances/20fd0a46-611d-4fcb-942c-11b291dfbaad_del
Jan 21 18:39:49 compute-0 nova_compute[183278]: 2026-01-21 18:39:49.172 183284 INFO nova.virt.libvirt.driver [None req-d84ea561-2caa-42d9-84a3-4f700228ccc7 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] [instance: 20fd0a46-611d-4fcb-942c-11b291dfbaad] Deletion of /var/lib/nova/instances/20fd0a46-611d-4fcb-942c-11b291dfbaad_del complete
Jan 21 18:39:49 compute-0 podman[212668]: 2026-01-21 18:39:49.174070763 +0000 UTC m=+0.043011241 container remove 9f6aa24469285926243a84968f9eeb1e2d5da0305e05d87e4a6735a97a6d4d28 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bdeafc54-899b-4c66-8810-1da93d23a91b, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 21 18:39:49 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:39:49.179 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[b8446f0b-daf4-43c7-8d15-2894939585b5]: (4, ('Wed Jan 21 06:39:49 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-bdeafc54-899b-4c66-8810-1da93d23a91b (9f6aa24469285926243a84968f9eeb1e2d5da0305e05d87e4a6735a97a6d4d28)\n9f6aa24469285926243a84968f9eeb1e2d5da0305e05d87e4a6735a97a6d4d28\nWed Jan 21 06:39:49 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-bdeafc54-899b-4c66-8810-1da93d23a91b (9f6aa24469285926243a84968f9eeb1e2d5da0305e05d87e4a6735a97a6d4d28)\n9f6aa24469285926243a84968f9eeb1e2d5da0305e05d87e4a6735a97a6d4d28\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:39:49 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:39:49.182 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[781f50d5-3299-45a6-9f22-f736cf070dc6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:39:49 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:39:49.183 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbdeafc54-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:39:49 compute-0 nova_compute[183278]: 2026-01-21 18:39:49.185 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:39:49 compute-0 kernel: tapbdeafc54-80: left promiscuous mode
Jan 21 18:39:49 compute-0 nova_compute[183278]: 2026-01-21 18:39:49.197 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:39:49 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:39:49.200 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[e61c2b7f-1686-4a70-9720-1f74a18b1e97]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:39:49 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:39:49.215 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[fefa49e1-c4d7-457a-8d0c-f4628898f56d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:39:49 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:39:49.216 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[560d2d35-c37e-4b04-b7ed-6ebb6a5da97f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:39:49 compute-0 nova_compute[183278]: 2026-01-21 18:39:49.217 183284 INFO nova.compute.manager [None req-d84ea561-2caa-42d9-84a3-4f700228ccc7 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] [instance: 20fd0a46-611d-4fcb-942c-11b291dfbaad] Took 0.33 seconds to destroy the instance on the hypervisor.
Jan 21 18:39:49 compute-0 nova_compute[183278]: 2026-01-21 18:39:49.217 183284 DEBUG oslo.service.loopingcall [None req-d84ea561-2caa-42d9-84a3-4f700228ccc7 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 21 18:39:49 compute-0 nova_compute[183278]: 2026-01-21 18:39:49.217 183284 DEBUG nova.compute.manager [-] [instance: 20fd0a46-611d-4fcb-942c-11b291dfbaad] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 21 18:39:49 compute-0 nova_compute[183278]: 2026-01-21 18:39:49.218 183284 DEBUG nova.network.neutron [-] [instance: 20fd0a46-611d-4fcb-942c-11b291dfbaad] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 21 18:39:49 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:39:49.232 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[f218f580-86e8-4aad-ac5a-1f5899f9f9c6]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 525080, 'reachable_time': 29267, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212687, 'error': None, 'target': 'ovnmeta-bdeafc54-899b-4c66-8810-1da93d23a91b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:39:49 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:39:49.235 105036 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-bdeafc54-899b-4c66-8810-1da93d23a91b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 21 18:39:49 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:39:49.235 105036 DEBUG oslo.privsep.daemon [-] privsep: reply[d6f01ea2-3e56-4d9f-8809-627b1416768c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:39:49 compute-0 systemd[1]: run-netns-ovnmeta\x2dbdeafc54\x2d899b\x2d4c66\x2d8810\x2d1da93d23a91b.mount: Deactivated successfully.
Jan 21 18:39:49 compute-0 nova_compute[183278]: 2026-01-21 18:39:49.662 183284 DEBUG nova.compute.manager [req-1f4b83b1-4d5c-493f-b528-f05dab39ee25 req-4d1d817f-3514-49bd-8f99-30fc2eae3f1a 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1086a30e-9d4f-4278-8d00-f4ce489f3c08] Received event network-vif-plugged-06b8ba3f-960c-4966-9687-d40021122824 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:39:49 compute-0 nova_compute[183278]: 2026-01-21 18:39:49.663 183284 DEBUG oslo_concurrency.lockutils [req-1f4b83b1-4d5c-493f-b528-f05dab39ee25 req-4d1d817f-3514-49bd-8f99-30fc2eae3f1a 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "1086a30e-9d4f-4278-8d00-f4ce489f3c08-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:39:49 compute-0 nova_compute[183278]: 2026-01-21 18:39:49.663 183284 DEBUG oslo_concurrency.lockutils [req-1f4b83b1-4d5c-493f-b528-f05dab39ee25 req-4d1d817f-3514-49bd-8f99-30fc2eae3f1a 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "1086a30e-9d4f-4278-8d00-f4ce489f3c08-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:39:49 compute-0 nova_compute[183278]: 2026-01-21 18:39:49.663 183284 DEBUG oslo_concurrency.lockutils [req-1f4b83b1-4d5c-493f-b528-f05dab39ee25 req-4d1d817f-3514-49bd-8f99-30fc2eae3f1a 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "1086a30e-9d4f-4278-8d00-f4ce489f3c08-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:39:49 compute-0 nova_compute[183278]: 2026-01-21 18:39:49.663 183284 DEBUG nova.compute.manager [req-1f4b83b1-4d5c-493f-b528-f05dab39ee25 req-4d1d817f-3514-49bd-8f99-30fc2eae3f1a 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1086a30e-9d4f-4278-8d00-f4ce489f3c08] No waiting events found dispatching network-vif-plugged-06b8ba3f-960c-4966-9687-d40021122824 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:39:49 compute-0 nova_compute[183278]: 2026-01-21 18:39:49.664 183284 WARNING nova.compute.manager [req-1f4b83b1-4d5c-493f-b528-f05dab39ee25 req-4d1d817f-3514-49bd-8f99-30fc2eae3f1a 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 1086a30e-9d4f-4278-8d00-f4ce489f3c08] Received unexpected event network-vif-plugged-06b8ba3f-960c-4966-9687-d40021122824 for instance with vm_state deleted and task_state None.
Jan 21 18:39:51 compute-0 nova_compute[183278]: 2026-01-21 18:39:50.999 183284 DEBUG nova.compute.manager [req-f770a3ca-2c14-4038-b0b4-b4bd641bb77c req-d1f6fff8-1903-44e7-a02b-9682e326b20f 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 20fd0a46-611d-4fcb-942c-11b291dfbaad] Received event network-vif-unplugged-40abab10-895c-40a6-b87b-12fced1a5b22 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:39:51 compute-0 nova_compute[183278]: 2026-01-21 18:39:51.000 183284 DEBUG oslo_concurrency.lockutils [req-f770a3ca-2c14-4038-b0b4-b4bd641bb77c req-d1f6fff8-1903-44e7-a02b-9682e326b20f 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "20fd0a46-611d-4fcb-942c-11b291dfbaad-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:39:51 compute-0 nova_compute[183278]: 2026-01-21 18:39:51.000 183284 DEBUG oslo_concurrency.lockutils [req-f770a3ca-2c14-4038-b0b4-b4bd641bb77c req-d1f6fff8-1903-44e7-a02b-9682e326b20f 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "20fd0a46-611d-4fcb-942c-11b291dfbaad-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:39:51 compute-0 nova_compute[183278]: 2026-01-21 18:39:51.000 183284 DEBUG oslo_concurrency.lockutils [req-f770a3ca-2c14-4038-b0b4-b4bd641bb77c req-d1f6fff8-1903-44e7-a02b-9682e326b20f 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "20fd0a46-611d-4fcb-942c-11b291dfbaad-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:39:51 compute-0 nova_compute[183278]: 2026-01-21 18:39:51.001 183284 DEBUG nova.compute.manager [req-f770a3ca-2c14-4038-b0b4-b4bd641bb77c req-d1f6fff8-1903-44e7-a02b-9682e326b20f 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 20fd0a46-611d-4fcb-942c-11b291dfbaad] No waiting events found dispatching network-vif-unplugged-40abab10-895c-40a6-b87b-12fced1a5b22 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:39:51 compute-0 nova_compute[183278]: 2026-01-21 18:39:51.001 183284 DEBUG nova.compute.manager [req-f770a3ca-2c14-4038-b0b4-b4bd641bb77c req-d1f6fff8-1903-44e7-a02b-9682e326b20f 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 20fd0a46-611d-4fcb-942c-11b291dfbaad] Received event network-vif-unplugged-40abab10-895c-40a6-b87b-12fced1a5b22 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 21 18:39:51 compute-0 podman[212688]: 2026-01-21 18:39:51.00275752 +0000 UTC m=+0.056786813 container health_status 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-type=git, version=9.6, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7)
Jan 21 18:39:51 compute-0 nova_compute[183278]: 2026-01-21 18:39:51.283 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:39:52 compute-0 nova_compute[183278]: 2026-01-21 18:39:52.182 183284 DEBUG nova.network.neutron [-] [instance: 20fd0a46-611d-4fcb-942c-11b291dfbaad] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:39:52 compute-0 nova_compute[183278]: 2026-01-21 18:39:52.199 183284 INFO nova.compute.manager [-] [instance: 20fd0a46-611d-4fcb-942c-11b291dfbaad] Took 2.98 seconds to deallocate network for instance.
Jan 21 18:39:52 compute-0 nova_compute[183278]: 2026-01-21 18:39:52.240 183284 DEBUG oslo_concurrency.lockutils [None req-d84ea561-2caa-42d9-84a3-4f700228ccc7 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:39:52 compute-0 nova_compute[183278]: 2026-01-21 18:39:52.241 183284 DEBUG oslo_concurrency.lockutils [None req-d84ea561-2caa-42d9-84a3-4f700228ccc7 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:39:52 compute-0 nova_compute[183278]: 2026-01-21 18:39:52.299 183284 DEBUG nova.compute.provider_tree [None req-d84ea561-2caa-42d9-84a3-4f700228ccc7 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Inventory has not changed in ProviderTree for provider: 502e4243-611b-433d-a766-9b485d51652d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 18:39:52 compute-0 nova_compute[183278]: 2026-01-21 18:39:52.315 183284 DEBUG nova.scheduler.client.report [None req-d84ea561-2caa-42d9-84a3-4f700228ccc7 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Inventory has not changed for provider 502e4243-611b-433d-a766-9b485d51652d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 18:39:52 compute-0 nova_compute[183278]: 2026-01-21 18:39:52.339 183284 DEBUG oslo_concurrency.lockutils [None req-d84ea561-2caa-42d9-84a3-4f700228ccc7 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.098s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:39:52 compute-0 nova_compute[183278]: 2026-01-21 18:39:52.369 183284 INFO nova.scheduler.client.report [None req-d84ea561-2caa-42d9-84a3-4f700228ccc7 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Deleted allocations for instance 20fd0a46-611d-4fcb-942c-11b291dfbaad
Jan 21 18:39:52 compute-0 nova_compute[183278]: 2026-01-21 18:39:52.443 183284 DEBUG oslo_concurrency.lockutils [None req-d84ea561-2caa-42d9-84a3-4f700228ccc7 61bd089d194f4cf380e1a5f0c92c9c62 55c360f5fa1e445396df9c0ba67fb46d - - default default] Lock "20fd0a46-611d-4fcb-942c-11b291dfbaad" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.556s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:39:53 compute-0 nova_compute[183278]: 2026-01-21 18:39:53.084 183284 DEBUG nova.compute.manager [req-f1864aeb-cf1f-47f7-95ef-041dd26e5ed5 req-298f3f22-7b93-4340-9ca2-0f87a191b733 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 20fd0a46-611d-4fcb-942c-11b291dfbaad] Received event network-vif-plugged-40abab10-895c-40a6-b87b-12fced1a5b22 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:39:53 compute-0 nova_compute[183278]: 2026-01-21 18:39:53.084 183284 DEBUG oslo_concurrency.lockutils [req-f1864aeb-cf1f-47f7-95ef-041dd26e5ed5 req-298f3f22-7b93-4340-9ca2-0f87a191b733 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "20fd0a46-611d-4fcb-942c-11b291dfbaad-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:39:53 compute-0 nova_compute[183278]: 2026-01-21 18:39:53.085 183284 DEBUG oslo_concurrency.lockutils [req-f1864aeb-cf1f-47f7-95ef-041dd26e5ed5 req-298f3f22-7b93-4340-9ca2-0f87a191b733 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "20fd0a46-611d-4fcb-942c-11b291dfbaad-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:39:53 compute-0 nova_compute[183278]: 2026-01-21 18:39:53.085 183284 DEBUG oslo_concurrency.lockutils [req-f1864aeb-cf1f-47f7-95ef-041dd26e5ed5 req-298f3f22-7b93-4340-9ca2-0f87a191b733 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "20fd0a46-611d-4fcb-942c-11b291dfbaad-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:39:53 compute-0 nova_compute[183278]: 2026-01-21 18:39:53.086 183284 DEBUG nova.compute.manager [req-f1864aeb-cf1f-47f7-95ef-041dd26e5ed5 req-298f3f22-7b93-4340-9ca2-0f87a191b733 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 20fd0a46-611d-4fcb-942c-11b291dfbaad] No waiting events found dispatching network-vif-plugged-40abab10-895c-40a6-b87b-12fced1a5b22 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:39:53 compute-0 nova_compute[183278]: 2026-01-21 18:39:53.087 183284 WARNING nova.compute.manager [req-f1864aeb-cf1f-47f7-95ef-041dd26e5ed5 req-298f3f22-7b93-4340-9ca2-0f87a191b733 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 20fd0a46-611d-4fcb-942c-11b291dfbaad] Received unexpected event network-vif-plugged-40abab10-895c-40a6-b87b-12fced1a5b22 for instance with vm_state deleted and task_state None.
Jan 21 18:39:53 compute-0 nova_compute[183278]: 2026-01-21 18:39:53.087 183284 DEBUG nova.compute.manager [req-f1864aeb-cf1f-47f7-95ef-041dd26e5ed5 req-298f3f22-7b93-4340-9ca2-0f87a191b733 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: 20fd0a46-611d-4fcb-942c-11b291dfbaad] Received event network-vif-deleted-40abab10-895c-40a6-b87b-12fced1a5b22 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:39:54 compute-0 nova_compute[183278]: 2026-01-21 18:39:54.198 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:39:56 compute-0 nova_compute[183278]: 2026-01-21 18:39:56.327 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:39:57 compute-0 podman[212711]: 2026-01-21 18:39:57.989399387 +0000 UTC m=+0.046277630 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 21 18:39:58 compute-0 podman[212710]: 2026-01-21 18:39:58.019775541 +0000 UTC m=+0.077848363 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller)
Jan 21 18:39:59 compute-0 nova_compute[183278]: 2026-01-21 18:39:59.200 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:39:59 compute-0 podman[192560]: time="2026-01-21T18:39:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 18:39:59 compute-0 podman[192560]: @ - - [21/Jan/2026:18:39:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15354 "" "Go-http-client/1.1"
Jan 21 18:39:59 compute-0 podman[192560]: @ - - [21/Jan/2026:18:39:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2181 "" "Go-http-client/1.1"
Jan 21 18:40:01 compute-0 nova_compute[183278]: 2026-01-21 18:40:01.368 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:40:01 compute-0 openstack_network_exporter[195402]: ERROR   18:40:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 21 18:40:01 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:40:01 compute-0 openstack_network_exporter[195402]: ERROR   18:40:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 21 18:40:01 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:40:02 compute-0 nova_compute[183278]: 2026-01-21 18:40:02.317 183284 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769020787.3164377, 1086a30e-9d4f-4278-8d00-f4ce489f3c08 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:40:02 compute-0 nova_compute[183278]: 2026-01-21 18:40:02.318 183284 INFO nova.compute.manager [-] [instance: 1086a30e-9d4f-4278-8d00-f4ce489f3c08] VM Stopped (Lifecycle Event)
Jan 21 18:40:02 compute-0 nova_compute[183278]: 2026-01-21 18:40:02.349 183284 DEBUG nova.compute.manager [None req-c0d56ed1-b99c-4c9b-9249-1ec97f6b9c54 - - - - - -] [instance: 1086a30e-9d4f-4278-8d00-f4ce489f3c08] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:40:02 compute-0 podman[212757]: 2026-01-21 18:40:02.987131332 +0000 UTC m=+0.046304630 container health_status 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 18:40:04 compute-0 nova_compute[183278]: 2026-01-21 18:40:04.148 183284 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769020789.1462169, 20fd0a46-611d-4fcb-942c-11b291dfbaad => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:40:04 compute-0 nova_compute[183278]: 2026-01-21 18:40:04.148 183284 INFO nova.compute.manager [-] [instance: 20fd0a46-611d-4fcb-942c-11b291dfbaad] VM Stopped (Lifecycle Event)
Jan 21 18:40:04 compute-0 nova_compute[183278]: 2026-01-21 18:40:04.165 183284 DEBUG nova.compute.manager [None req-b7a242e9-7d35-4fd1-8c08-f8a4506b5d78 - - - - - -] [instance: 20fd0a46-611d-4fcb-942c-11b291dfbaad] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:40:04 compute-0 nova_compute[183278]: 2026-01-21 18:40:04.200 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:40:06 compute-0 nova_compute[183278]: 2026-01-21 18:40:06.370 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:40:09 compute-0 nova_compute[183278]: 2026-01-21 18:40:09.202 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:40:11 compute-0 nova_compute[183278]: 2026-01-21 18:40:11.406 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:40:13 compute-0 sshd-session[212782]: Invalid user user from 64.227.98.100 port 41548
Jan 21 18:40:13 compute-0 sshd-session[212782]: Connection closed by invalid user user 64.227.98.100 port 41548 [preauth]
Jan 21 18:40:14 compute-0 nova_compute[183278]: 2026-01-21 18:40:14.204 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:40:16 compute-0 nova_compute[183278]: 2026-01-21 18:40:16.407 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:40:19 compute-0 nova_compute[183278]: 2026-01-21 18:40:19.242 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:40:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:40:20.111 104698 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:40:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:40:20.112 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:40:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:40:20.112 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:40:21 compute-0 nova_compute[183278]: 2026-01-21 18:40:21.456 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:40:21 compute-0 podman[212784]: 2026-01-21 18:40:21.993250347 +0000 UTC m=+0.052402687 container health_status 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, version=9.6, container_name=openstack_network_exporter, io.buildah.version=1.33.7, vcs-type=git, config_id=openstack_network_exporter, managed_by=edpm_ansible, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 21 18:40:22 compute-0 ovn_controller[95419]: 2026-01-21T18:40:22Z|00195|memory_trim|INFO|Detected inactivity (last active 30006 ms ago): trimming memory
Jan 21 18:40:24 compute-0 nova_compute[183278]: 2026-01-21 18:40:24.244 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:40:26 compute-0 nova_compute[183278]: 2026-01-21 18:40:26.457 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:40:27 compute-0 nova_compute[183278]: 2026-01-21 18:40:27.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:40:27 compute-0 nova_compute[183278]: 2026-01-21 18:40:27.817 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 18:40:27 compute-0 nova_compute[183278]: 2026-01-21 18:40:27.818 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 18:40:27 compute-0 nova_compute[183278]: 2026-01-21 18:40:27.861 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 21 18:40:28 compute-0 nova_compute[183278]: 2026-01-21 18:40:28.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:40:28 compute-0 nova_compute[183278]: 2026-01-21 18:40:28.817 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 21 18:40:29 compute-0 podman[212806]: 2026-01-21 18:40:29.005349419 +0000 UTC m=+0.056457096 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true)
Jan 21 18:40:29 compute-0 podman[212805]: 2026-01-21 18:40:29.037390093 +0000 UTC m=+0.088481939 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, 
managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 21 18:40:29 compute-0 nova_compute[183278]: 2026-01-21 18:40:29.245 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:40:29 compute-0 podman[192560]: time="2026-01-21T18:40:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 18:40:29 compute-0 podman[192560]: @ - - [21/Jan/2026:18:40:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15354 "" "Go-http-client/1.1"
Jan 21 18:40:29 compute-0 podman[192560]: @ - - [21/Jan/2026:18:40:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2183 "" "Go-http-client/1.1"
Jan 21 18:40:30 compute-0 nova_compute[183278]: 2026-01-21 18:40:30.095 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:40:30 compute-0 nova_compute[183278]: 2026-01-21 18:40:30.833 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:40:31 compute-0 openstack_network_exporter[195402]: ERROR   18:40:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 21 18:40:31 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:40:31 compute-0 openstack_network_exporter[195402]: ERROR   18:40:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 21 18:40:31 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:40:31 compute-0 nova_compute[183278]: 2026-01-21 18:40:31.458 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:40:31 compute-0 nova_compute[183278]: 2026-01-21 18:40:31.812 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:40:32 compute-0 nova_compute[183278]: 2026-01-21 18:40:32.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:40:32 compute-0 nova_compute[183278]: 2026-01-21 18:40:32.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:40:32 compute-0 nova_compute[183278]: 2026-01-21 18:40:32.817 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 21 18:40:32 compute-0 nova_compute[183278]: 2026-01-21 18:40:32.833 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 21 18:40:33 compute-0 nova_compute[183278]: 2026-01-21 18:40:33.833 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:40:34 compute-0 podman[212851]: 2026-01-21 18:40:34.001606759 +0000 UTC m=+0.047797217 container health_status 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 18:40:34 compute-0 nova_compute[183278]: 2026-01-21 18:40:34.247 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:40:34 compute-0 nova_compute[183278]: 2026-01-21 18:40:34.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:40:36 compute-0 nova_compute[183278]: 2026-01-21 18:40:36.461 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:40:36 compute-0 nova_compute[183278]: 2026-01-21 18:40:36.611 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:40:36 compute-0 nova_compute[183278]: 2026-01-21 18:40:36.612 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:40:36 compute-0 nova_compute[183278]: 2026-01-21 18:40:36.612 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:40:36 compute-0 nova_compute[183278]: 2026-01-21 18:40:36.612 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 18:40:36 compute-0 nova_compute[183278]: 2026-01-21 18:40:36.754 183284 WARNING nova.virt.libvirt.driver [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 18:40:36 compute-0 nova_compute[183278]: 2026-01-21 18:40:36.756 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5852MB free_disk=73.3788833618164GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 18:40:36 compute-0 nova_compute[183278]: 2026-01-21 18:40:36.756 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:40:36 compute-0 nova_compute[183278]: 2026-01-21 18:40:36.756 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:40:37 compute-0 nova_compute[183278]: 2026-01-21 18:40:37.937 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 18:40:37 compute-0 nova_compute[183278]: 2026-01-21 18:40:37.937 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 18:40:37 compute-0 nova_compute[183278]: 2026-01-21 18:40:37.965 183284 DEBUG nova.compute.provider_tree [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Inventory has not changed in ProviderTree for provider: 502e4243-611b-433d-a766-9b485d51652d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 18:40:38 compute-0 nova_compute[183278]: 2026-01-21 18:40:38.156 183284 DEBUG nova.scheduler.client.report [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Inventory has not changed for provider 502e4243-611b-433d-a766-9b485d51652d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 18:40:38 compute-0 nova_compute[183278]: 2026-01-21 18:40:38.339 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 18:40:38 compute-0 nova_compute[183278]: 2026-01-21 18:40:38.340 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.583s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:40:39 compute-0 nova_compute[183278]: 2026-01-21 18:40:39.089 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:40:39 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:40:39.088 104698 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=26, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e2:aa:66', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f6:39:87:73:c8'}, ipsec=False) old=SB_Global(nb_cfg=25) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 18:40:39 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:40:39.089 104698 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 21 18:40:39 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:40:39.090 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a4db021d-a451-4e5f-8011-49af760bda68, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '26'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:40:39 compute-0 nova_compute[183278]: 2026-01-21 18:40:39.250 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:40:41 compute-0 nova_compute[183278]: 2026-01-21 18:40:41.340 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:40:41 compute-0 nova_compute[183278]: 2026-01-21 18:40:41.341 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:40:41 compute-0 nova_compute[183278]: 2026-01-21 18:40:41.463 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:40:42 compute-0 nova_compute[183278]: 2026-01-21 18:40:42.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:40:42 compute-0 nova_compute[183278]: 2026-01-21 18:40:42.817 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 18:40:44 compute-0 nova_compute[183278]: 2026-01-21 18:40:44.251 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:40:46 compute-0 nova_compute[183278]: 2026-01-21 18:40:46.469 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:40:49 compute-0 nova_compute[183278]: 2026-01-21 18:40:49.259 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:40:51 compute-0 nova_compute[183278]: 2026-01-21 18:40:51.471 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:40:53 compute-0 podman[212877]: 2026-01-21 18:40:53.004931439 +0000 UTC m=+0.060571105 container health_status 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, vcs-type=git, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., container_name=openstack_network_exporter, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, distribution-scope=public, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, release=1755695350, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': 
'/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7)
Jan 21 18:40:54 compute-0 nova_compute[183278]: 2026-01-21 18:40:54.262 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:40:54 compute-0 nova_compute[183278]: 2026-01-21 18:40:54.819 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:40:56 compute-0 nova_compute[183278]: 2026-01-21 18:40:56.511 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:40:59 compute-0 nova_compute[183278]: 2026-01-21 18:40:59.265 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:40:59 compute-0 podman[192560]: time="2026-01-21T18:40:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 18:40:59 compute-0 podman[192560]: @ - - [21/Jan/2026:18:40:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15354 "" "Go-http-client/1.1"
Jan 21 18:40:59 compute-0 podman[192560]: @ - - [21/Jan/2026:18:40:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2179 "" "Go-http-client/1.1"
Jan 21 18:40:59 compute-0 podman[212900]: 2026-01-21 18:40:59.993179204 +0000 UTC m=+0.048263398 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 18:41:00 compute-0 podman[212899]: 2026-01-21 18:41:00.018112617 +0000 UTC m=+0.075062657 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
config_id=ovn_controller, managed_by=edpm_ansible)
Jan 21 18:41:01 compute-0 openstack_network_exporter[195402]: ERROR   18:41:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 21 18:41:01 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:41:01 compute-0 openstack_network_exporter[195402]: ERROR   18:41:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 21 18:41:01 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:41:01 compute-0 nova_compute[183278]: 2026-01-21 18:41:01.513 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:41:04 compute-0 nova_compute[183278]: 2026-01-21 18:41:04.267 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:41:04 compute-0 podman[212944]: 2026-01-21 18:41:04.991454553 +0000 UTC m=+0.047527080 container health_status 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 21 18:41:06 compute-0 nova_compute[183278]: 2026-01-21 18:41:06.514 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:41:09 compute-0 nova_compute[183278]: 2026-01-21 18:41:09.268 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:41:11 compute-0 nova_compute[183278]: 2026-01-21 18:41:11.555 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:41:12 compute-0 ovn_controller[95419]: 2026-01-21T18:41:12Z|00196|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 21 18:41:14 compute-0 nova_compute[183278]: 2026-01-21 18:41:14.270 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:41:16 compute-0 nova_compute[183278]: 2026-01-21 18:41:16.604 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:41:19 compute-0 nova_compute[183278]: 2026-01-21 18:41:19.271 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:41:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:41:20.113 104698 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:41:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:41:20.113 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:41:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:41:20.113 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:41:21 compute-0 nova_compute[183278]: 2026-01-21 18:41:21.607 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:41:23 compute-0 podman[212969]: 2026-01-21 18:41:23.996463134 +0000 UTC m=+0.048757809 container health_status 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vcs-type=git, io.openshift.tags=minimal rhel9, version=9.6, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7)
Jan 21 18:41:24 compute-0 nova_compute[183278]: 2026-01-21 18:41:24.271 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:41:26 compute-0 nova_compute[183278]: 2026-01-21 18:41:26.610 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:41:27 compute-0 nova_compute[183278]: 2026-01-21 18:41:27.829 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:41:27 compute-0 nova_compute[183278]: 2026-01-21 18:41:27.830 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 18:41:27 compute-0 nova_compute[183278]: 2026-01-21 18:41:27.830 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 18:41:27 compute-0 nova_compute[183278]: 2026-01-21 18:41:27.846 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 21 18:41:29 compute-0 nova_compute[183278]: 2026-01-21 18:41:29.273 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:41:29 compute-0 podman[192560]: time="2026-01-21T18:41:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 18:41:29 compute-0 podman[192560]: @ - - [21/Jan/2026:18:41:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15354 "" "Go-http-client/1.1"
Jan 21 18:41:29 compute-0 podman[192560]: @ - - [21/Jan/2026:18:41:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2179 "" "Go-http-client/1.1"
Jan 21 18:41:30 compute-0 podman[212990]: 2026-01-21 18:41:30.996459574 +0000 UTC m=+0.047267164 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 21 18:41:31 compute-0 podman[212989]: 2026-01-21 18:41:31.020194648 +0000 UTC m=+0.076877630 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 21 18:41:31 compute-0 openstack_network_exporter[195402]: ERROR   18:41:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 21 18:41:31 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:41:31 compute-0 openstack_network_exporter[195402]: ERROR   18:41:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 21 18:41:31 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:41:31 compute-0 nova_compute[183278]: 2026-01-21 18:41:31.657 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:41:32 compute-0 nova_compute[183278]: 2026-01-21 18:41:32.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:41:32 compute-0 nova_compute[183278]: 2026-01-21 18:41:32.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:41:32 compute-0 nova_compute[183278]: 2026-01-21 18:41:32.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:41:33 compute-0 nova_compute[183278]: 2026-01-21 18:41:33.137 183284 DEBUG oslo_concurrency.lockutils [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] Acquiring lock "b53935fe-61d0-4662-9242-b4afae882b6e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:41:33 compute-0 nova_compute[183278]: 2026-01-21 18:41:33.137 183284 DEBUG oslo_concurrency.lockutils [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] Lock "b53935fe-61d0-4662-9242-b4afae882b6e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:41:33 compute-0 nova_compute[183278]: 2026-01-21 18:41:33.157 183284 DEBUG nova.compute.manager [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 21 18:41:33 compute-0 nova_compute[183278]: 2026-01-21 18:41:33.280 183284 DEBUG oslo_concurrency.lockutils [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:41:33 compute-0 nova_compute[183278]: 2026-01-21 18:41:33.280 183284 DEBUG oslo_concurrency.lockutils [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:41:33 compute-0 nova_compute[183278]: 2026-01-21 18:41:33.286 183284 DEBUG nova.virt.hardware [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 18:41:33 compute-0 nova_compute[183278]: 2026-01-21 18:41:33.287 183284 INFO nova.compute.claims [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Claim successful on node compute-0.ctlplane.example.com
Jan 21 18:41:33 compute-0 nova_compute[183278]: 2026-01-21 18:41:33.410 183284 DEBUG nova.compute.provider_tree [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] Inventory has not changed in ProviderTree for provider: 502e4243-611b-433d-a766-9b485d51652d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 18:41:33 compute-0 nova_compute[183278]: 2026-01-21 18:41:33.426 183284 DEBUG nova.scheduler.client.report [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] Inventory has not changed for provider 502e4243-611b-433d-a766-9b485d51652d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 18:41:33 compute-0 nova_compute[183278]: 2026-01-21 18:41:33.444 183284 DEBUG oslo_concurrency.lockutils [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.163s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:41:33 compute-0 nova_compute[183278]: 2026-01-21 18:41:33.444 183284 DEBUG nova.compute.manager [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 21 18:41:33 compute-0 nova_compute[183278]: 2026-01-21 18:41:33.489 183284 DEBUG nova.compute.manager [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 21 18:41:33 compute-0 nova_compute[183278]: 2026-01-21 18:41:33.489 183284 DEBUG nova.network.neutron [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 21 18:41:33 compute-0 nova_compute[183278]: 2026-01-21 18:41:33.511 183284 INFO nova.virt.libvirt.driver [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 21 18:41:33 compute-0 nova_compute[183278]: 2026-01-21 18:41:33.531 183284 DEBUG nova.compute.manager [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 21 18:41:33 compute-0 nova_compute[183278]: 2026-01-21 18:41:33.622 183284 DEBUG nova.compute.manager [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 21 18:41:33 compute-0 nova_compute[183278]: 2026-01-21 18:41:33.623 183284 DEBUG nova.virt.libvirt.driver [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 18:41:33 compute-0 nova_compute[183278]: 2026-01-21 18:41:33.623 183284 INFO nova.virt.libvirt.driver [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Creating image(s)
Jan 21 18:41:33 compute-0 nova_compute[183278]: 2026-01-21 18:41:33.624 183284 DEBUG oslo_concurrency.lockutils [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] Acquiring lock "/var/lib/nova/instances/b53935fe-61d0-4662-9242-b4afae882b6e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:41:33 compute-0 nova_compute[183278]: 2026-01-21 18:41:33.624 183284 DEBUG oslo_concurrency.lockutils [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] Lock "/var/lib/nova/instances/b53935fe-61d0-4662-9242-b4afae882b6e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:41:33 compute-0 nova_compute[183278]: 2026-01-21 18:41:33.625 183284 DEBUG oslo_concurrency.lockutils [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] Lock "/var/lib/nova/instances/b53935fe-61d0-4662-9242-b4afae882b6e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:41:33 compute-0 nova_compute[183278]: 2026-01-21 18:41:33.636 183284 DEBUG oslo_concurrency.processutils [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:41:33 compute-0 nova_compute[183278]: 2026-01-21 18:41:33.691 183284 DEBUG oslo_concurrency.processutils [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:41:33 compute-0 nova_compute[183278]: 2026-01-21 18:41:33.693 183284 DEBUG oslo_concurrency.lockutils [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] Acquiring lock "8bf45782a806e8f2f684ae874be9ab99d891a685" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:41:33 compute-0 nova_compute[183278]: 2026-01-21 18:41:33.693 183284 DEBUG oslo_concurrency.lockutils [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] Lock "8bf45782a806e8f2f684ae874be9ab99d891a685" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:41:33 compute-0 nova_compute[183278]: 2026-01-21 18:41:33.704 183284 DEBUG oslo_concurrency.processutils [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:41:33 compute-0 nova_compute[183278]: 2026-01-21 18:41:33.755 183284 DEBUG oslo_concurrency.processutils [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:41:33 compute-0 nova_compute[183278]: 2026-01-21 18:41:33.756 183284 DEBUG oslo_concurrency.processutils [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685,backing_fmt=raw /var/lib/nova/instances/b53935fe-61d0-4662-9242-b4afae882b6e/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:41:33 compute-0 nova_compute[183278]: 2026-01-21 18:41:33.953 183284 DEBUG nova.policy [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '02dcb8b770104e6fbacfd9aced0763ce', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e24c6330d8214c038817e159aa32ee75', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 21 18:41:33 compute-0 nova_compute[183278]: 2026-01-21 18:41:33.992 183284 DEBUG oslo_concurrency.processutils [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685,backing_fmt=raw /var/lib/nova/instances/b53935fe-61d0-4662-9242-b4afae882b6e/disk 1073741824" returned: 0 in 0.236s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:41:33 compute-0 nova_compute[183278]: 2026-01-21 18:41:33.993 183284 DEBUG oslo_concurrency.lockutils [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] Lock "8bf45782a806e8f2f684ae874be9ab99d891a685" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.300s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:41:33 compute-0 nova_compute[183278]: 2026-01-21 18:41:33.993 183284 DEBUG oslo_concurrency.processutils [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:41:34 compute-0 nova_compute[183278]: 2026-01-21 18:41:34.049 183284 DEBUG oslo_concurrency.processutils [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:41:34 compute-0 nova_compute[183278]: 2026-01-21 18:41:34.051 183284 DEBUG nova.virt.disk.api [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] Checking if we can resize image /var/lib/nova/instances/b53935fe-61d0-4662-9242-b4afae882b6e/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 18:41:34 compute-0 nova_compute[183278]: 2026-01-21 18:41:34.051 183284 DEBUG oslo_concurrency.processutils [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b53935fe-61d0-4662-9242-b4afae882b6e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:41:34 compute-0 nova_compute[183278]: 2026-01-21 18:41:34.104 183284 DEBUG oslo_concurrency.processutils [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b53935fe-61d0-4662-9242-b4afae882b6e/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:41:34 compute-0 nova_compute[183278]: 2026-01-21 18:41:34.105 183284 DEBUG nova.virt.disk.api [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] Cannot resize image /var/lib/nova/instances/b53935fe-61d0-4662-9242-b4afae882b6e/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 18:41:34 compute-0 nova_compute[183278]: 2026-01-21 18:41:34.105 183284 DEBUG nova.objects.instance [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] Lazy-loading 'migration_context' on Instance uuid b53935fe-61d0-4662-9242-b4afae882b6e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:41:34 compute-0 nova_compute[183278]: 2026-01-21 18:41:34.124 183284 DEBUG nova.virt.libvirt.driver [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 18:41:34 compute-0 nova_compute[183278]: 2026-01-21 18:41:34.124 183284 DEBUG nova.virt.libvirt.driver [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Ensure instance console log exists: /var/lib/nova/instances/b53935fe-61d0-4662-9242-b4afae882b6e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 18:41:34 compute-0 nova_compute[183278]: 2026-01-21 18:41:34.125 183284 DEBUG oslo_concurrency.lockutils [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:41:34 compute-0 nova_compute[183278]: 2026-01-21 18:41:34.125 183284 DEBUG oslo_concurrency.lockutils [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:41:34 compute-0 nova_compute[183278]: 2026-01-21 18:41:34.125 183284 DEBUG oslo_concurrency.lockutils [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:41:34 compute-0 nova_compute[183278]: 2026-01-21 18:41:34.274 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:41:34 compute-0 nova_compute[183278]: 2026-01-21 18:41:34.517 183284 DEBUG nova.network.neutron [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Successfully created port: 31eaba9d-250a-4709-9d55-cc3e54eb722d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 21 18:41:35 compute-0 nova_compute[183278]: 2026-01-21 18:41:35.633 183284 DEBUG nova.network.neutron [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Successfully updated port: 31eaba9d-250a-4709-9d55-cc3e54eb722d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 21 18:41:35 compute-0 nova_compute[183278]: 2026-01-21 18:41:35.657 183284 DEBUG oslo_concurrency.lockutils [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] Acquiring lock "refresh_cache-b53935fe-61d0-4662-9242-b4afae882b6e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:41:35 compute-0 nova_compute[183278]: 2026-01-21 18:41:35.657 183284 DEBUG oslo_concurrency.lockutils [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] Acquired lock "refresh_cache-b53935fe-61d0-4662-9242-b4afae882b6e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 18:41:35 compute-0 nova_compute[183278]: 2026-01-21 18:41:35.658 183284 DEBUG nova.network.neutron [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 18:41:35 compute-0 nova_compute[183278]: 2026-01-21 18:41:35.738 183284 DEBUG nova.compute.manager [req-b3091527-f72f-4bfb-89d1-14ef79ec2786 req-dd314fdb-0994-43af-a5ad-f4b6c0eedb17 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Received event network-changed-31eaba9d-250a-4709-9d55-cc3e54eb722d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:41:35 compute-0 nova_compute[183278]: 2026-01-21 18:41:35.738 183284 DEBUG nova.compute.manager [req-b3091527-f72f-4bfb-89d1-14ef79ec2786 req-dd314fdb-0994-43af-a5ad-f4b6c0eedb17 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Refreshing instance network info cache due to event network-changed-31eaba9d-250a-4709-9d55-cc3e54eb722d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 18:41:35 compute-0 nova_compute[183278]: 2026-01-21 18:41:35.738 183284 DEBUG oslo_concurrency.lockutils [req-b3091527-f72f-4bfb-89d1-14ef79ec2786 req-dd314fdb-0994-43af-a5ad-f4b6c0eedb17 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "refresh_cache-b53935fe-61d0-4662-9242-b4afae882b6e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:41:35 compute-0 nova_compute[183278]: 2026-01-21 18:41:35.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:41:35 compute-0 nova_compute[183278]: 2026-01-21 18:41:35.911 183284 DEBUG nova.network.neutron [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 18:41:35 compute-0 podman[213049]: 2026-01-21 18:41:35.993714109 +0000 UTC m=+0.046690679 container health_status 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 21 18:41:36 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Jan 21 18:41:36 compute-0 nova_compute[183278]: 2026-01-21 18:41:36.659 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:41:36 compute-0 nova_compute[183278]: 2026-01-21 18:41:36.722 183284 DEBUG nova.network.neutron [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Updating instance_info_cache with network_info: [{"id": "31eaba9d-250a-4709-9d55-cc3e54eb722d", "address": "fa:16:3e:f4:5e:98", "network": {"id": "f06157f1-7b10-4c16-804b-abfbafeaf616", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-1767892451-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e24c6330d8214c038817e159aa32ee75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31eaba9d-25", "ovs_interfaceid": "31eaba9d-250a-4709-9d55-cc3e54eb722d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:41:36 compute-0 nova_compute[183278]: 2026-01-21 18:41:36.743 183284 DEBUG oslo_concurrency.lockutils [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] Releasing lock "refresh_cache-b53935fe-61d0-4662-9242-b4afae882b6e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 18:41:36 compute-0 nova_compute[183278]: 2026-01-21 18:41:36.744 183284 DEBUG nova.compute.manager [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Instance network_info: |[{"id": "31eaba9d-250a-4709-9d55-cc3e54eb722d", "address": "fa:16:3e:f4:5e:98", "network": {"id": "f06157f1-7b10-4c16-804b-abfbafeaf616", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-1767892451-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e24c6330d8214c038817e159aa32ee75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31eaba9d-25", "ovs_interfaceid": "31eaba9d-250a-4709-9d55-cc3e54eb722d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 21 18:41:36 compute-0 nova_compute[183278]: 2026-01-21 18:41:36.744 183284 DEBUG oslo_concurrency.lockutils [req-b3091527-f72f-4bfb-89d1-14ef79ec2786 req-dd314fdb-0994-43af-a5ad-f4b6c0eedb17 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquired lock "refresh_cache-b53935fe-61d0-4662-9242-b4afae882b6e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 18:41:36 compute-0 nova_compute[183278]: 2026-01-21 18:41:36.744 183284 DEBUG nova.network.neutron [req-b3091527-f72f-4bfb-89d1-14ef79ec2786 req-dd314fdb-0994-43af-a5ad-f4b6c0eedb17 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Refreshing network info cache for port 31eaba9d-250a-4709-9d55-cc3e54eb722d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 18:41:36 compute-0 nova_compute[183278]: 2026-01-21 18:41:36.747 183284 DEBUG nova.virt.libvirt.driver [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Start _get_guest_xml network_info=[{"id": "31eaba9d-250a-4709-9d55-cc3e54eb722d", "address": "fa:16:3e:f4:5e:98", "network": {"id": "f06157f1-7b10-4c16-804b-abfbafeaf616", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-1767892451-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e24c6330d8214c038817e159aa32ee75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31eaba9d-25", "ovs_interfaceid": "31eaba9d-250a-4709-9d55-cc3e54eb722d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T18:09:31Z,direct_url=<?>,disk_format='qcow2',id=672306ae-5521-4fc1-a825-a16d6d125c61,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='05bd8d3790e24414a90ecf55ee1939e1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T18:09:33Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_type': 'disk', 'encryption_options': None, 'boot_index': 0, 'encryption_secret_uuid': None, 'encryption_format': None, 'image_id': '672306ae-5521-4fc1-a825-a16d6d125c61'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 18:41:36 compute-0 nova_compute[183278]: 2026-01-21 18:41:36.753 183284 WARNING nova.virt.libvirt.driver [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 18:41:36 compute-0 nova_compute[183278]: 2026-01-21 18:41:36.758 183284 DEBUG nova.virt.libvirt.host [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 18:41:36 compute-0 nova_compute[183278]: 2026-01-21 18:41:36.758 183284 DEBUG nova.virt.libvirt.host [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 18:41:36 compute-0 nova_compute[183278]: 2026-01-21 18:41:36.764 183284 DEBUG nova.virt.libvirt.host [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 18:41:36 compute-0 nova_compute[183278]: 2026-01-21 18:41:36.765 183284 DEBUG nova.virt.libvirt.host [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 18:41:36 compute-0 nova_compute[183278]: 2026-01-21 18:41:36.766 183284 DEBUG nova.virt.libvirt.driver [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 18:41:36 compute-0 nova_compute[183278]: 2026-01-21 18:41:36.766 183284 DEBUG nova.virt.hardware [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T18:09:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='45095fe9-3fd5-4f1f-87b2-a2a8292135a2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T18:09:31Z,direct_url=<?>,disk_format='qcow2',id=672306ae-5521-4fc1-a825-a16d6d125c61,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='05bd8d3790e24414a90ecf55ee1939e1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T18:09:33Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 18:41:36 compute-0 nova_compute[183278]: 2026-01-21 18:41:36.767 183284 DEBUG nova.virt.hardware [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 18:41:36 compute-0 nova_compute[183278]: 2026-01-21 18:41:36.767 183284 DEBUG nova.virt.hardware [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 18:41:36 compute-0 nova_compute[183278]: 2026-01-21 18:41:36.767 183284 DEBUG nova.virt.hardware [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 18:41:36 compute-0 nova_compute[183278]: 2026-01-21 18:41:36.767 183284 DEBUG nova.virt.hardware [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 18:41:36 compute-0 nova_compute[183278]: 2026-01-21 18:41:36.767 183284 DEBUG nova.virt.hardware [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 18:41:36 compute-0 nova_compute[183278]: 2026-01-21 18:41:36.768 183284 DEBUG nova.virt.hardware [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 18:41:36 compute-0 nova_compute[183278]: 2026-01-21 18:41:36.768 183284 DEBUG nova.virt.hardware [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 18:41:36 compute-0 nova_compute[183278]: 2026-01-21 18:41:36.768 183284 DEBUG nova.virt.hardware [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 18:41:36 compute-0 nova_compute[183278]: 2026-01-21 18:41:36.768 183284 DEBUG nova.virt.hardware [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 18:41:36 compute-0 nova_compute[183278]: 2026-01-21 18:41:36.768 183284 DEBUG nova.virt.hardware [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 18:41:36 compute-0 nova_compute[183278]: 2026-01-21 18:41:36.771 183284 DEBUG nova.virt.libvirt.vif [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T18:41:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalancingStrategy-server-379452612',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancingstrategy-server-379452612',id=28,image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e24c6330d8214c038817e159aa32ee75',ramdisk_id='',reservation_id='r-uwu7zcyj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteWorkloadBalancingStrategy-1066838900',owner_user_name='tempest-TestExecuteWorkloadBalan
cingStrategy-1066838900-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T18:41:33Z,user_data=None,user_id='02dcb8b770104e6fbacfd9aced0763ce',uuid=b53935fe-61d0-4662-9242-b4afae882b6e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "31eaba9d-250a-4709-9d55-cc3e54eb722d", "address": "fa:16:3e:f4:5e:98", "network": {"id": "f06157f1-7b10-4c16-804b-abfbafeaf616", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-1767892451-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e24c6330d8214c038817e159aa32ee75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31eaba9d-25", "ovs_interfaceid": "31eaba9d-250a-4709-9d55-cc3e54eb722d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 18:41:36 compute-0 nova_compute[183278]: 2026-01-21 18:41:36.772 183284 DEBUG nova.network.os_vif_util [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] Converting VIF {"id": "31eaba9d-250a-4709-9d55-cc3e54eb722d", "address": "fa:16:3e:f4:5e:98", "network": {"id": "f06157f1-7b10-4c16-804b-abfbafeaf616", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-1767892451-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e24c6330d8214c038817e159aa32ee75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31eaba9d-25", "ovs_interfaceid": "31eaba9d-250a-4709-9d55-cc3e54eb722d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 18:41:36 compute-0 nova_compute[183278]: 2026-01-21 18:41:36.772 183284 DEBUG nova.network.os_vif_util [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f4:5e:98,bridge_name='br-int',has_traffic_filtering=True,id=31eaba9d-250a-4709-9d55-cc3e54eb722d,network=Network(f06157f1-7b10-4c16-804b-abfbafeaf616),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31eaba9d-25') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 18:41:36 compute-0 nova_compute[183278]: 2026-01-21 18:41:36.773 183284 DEBUG nova.objects.instance [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] Lazy-loading 'pci_devices' on Instance uuid b53935fe-61d0-4662-9242-b4afae882b6e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:41:36 compute-0 nova_compute[183278]: 2026-01-21 18:41:36.793 183284 DEBUG nova.virt.libvirt.driver [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] End _get_guest_xml xml=<domain type="kvm">
Jan 21 18:41:36 compute-0 nova_compute[183278]:   <uuid>b53935fe-61d0-4662-9242-b4afae882b6e</uuid>
Jan 21 18:41:36 compute-0 nova_compute[183278]:   <name>instance-0000001c</name>
Jan 21 18:41:36 compute-0 nova_compute[183278]:   <memory>131072</memory>
Jan 21 18:41:36 compute-0 nova_compute[183278]:   <vcpu>1</vcpu>
Jan 21 18:41:36 compute-0 nova_compute[183278]:   <metadata>
Jan 21 18:41:36 compute-0 nova_compute[183278]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 18:41:36 compute-0 nova_compute[183278]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 18:41:36 compute-0 nova_compute[183278]:       <nova:name>tempest-TestExecuteWorkloadBalancingStrategy-server-379452612</nova:name>
Jan 21 18:41:36 compute-0 nova_compute[183278]:       <nova:creationTime>2026-01-21 18:41:36</nova:creationTime>
Jan 21 18:41:36 compute-0 nova_compute[183278]:       <nova:flavor name="m1.nano">
Jan 21 18:41:36 compute-0 nova_compute[183278]:         <nova:memory>128</nova:memory>
Jan 21 18:41:36 compute-0 nova_compute[183278]:         <nova:disk>1</nova:disk>
Jan 21 18:41:36 compute-0 nova_compute[183278]:         <nova:swap>0</nova:swap>
Jan 21 18:41:36 compute-0 nova_compute[183278]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 18:41:36 compute-0 nova_compute[183278]:         <nova:vcpus>1</nova:vcpus>
Jan 21 18:41:36 compute-0 nova_compute[183278]:       </nova:flavor>
Jan 21 18:41:36 compute-0 nova_compute[183278]:       <nova:owner>
Jan 21 18:41:36 compute-0 nova_compute[183278]:         <nova:user uuid="02dcb8b770104e6fbacfd9aced0763ce">tempest-TestExecuteWorkloadBalancingStrategy-1066838900-project-member</nova:user>
Jan 21 18:41:36 compute-0 nova_compute[183278]:         <nova:project uuid="e24c6330d8214c038817e159aa32ee75">tempest-TestExecuteWorkloadBalancingStrategy-1066838900</nova:project>
Jan 21 18:41:36 compute-0 nova_compute[183278]:       </nova:owner>
Jan 21 18:41:36 compute-0 nova_compute[183278]:       <nova:root type="image" uuid="672306ae-5521-4fc1-a825-a16d6d125c61"/>
Jan 21 18:41:36 compute-0 nova_compute[183278]:       <nova:ports>
Jan 21 18:41:36 compute-0 nova_compute[183278]:         <nova:port uuid="31eaba9d-250a-4709-9d55-cc3e54eb722d">
Jan 21 18:41:36 compute-0 nova_compute[183278]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 21 18:41:36 compute-0 nova_compute[183278]:         </nova:port>
Jan 21 18:41:36 compute-0 nova_compute[183278]:       </nova:ports>
Jan 21 18:41:36 compute-0 nova_compute[183278]:     </nova:instance>
Jan 21 18:41:36 compute-0 nova_compute[183278]:   </metadata>
Jan 21 18:41:36 compute-0 nova_compute[183278]:   <sysinfo type="smbios">
Jan 21 18:41:36 compute-0 nova_compute[183278]:     <system>
Jan 21 18:41:36 compute-0 nova_compute[183278]:       <entry name="manufacturer">RDO</entry>
Jan 21 18:41:36 compute-0 nova_compute[183278]:       <entry name="product">OpenStack Compute</entry>
Jan 21 18:41:36 compute-0 nova_compute[183278]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 18:41:36 compute-0 nova_compute[183278]:       <entry name="serial">b53935fe-61d0-4662-9242-b4afae882b6e</entry>
Jan 21 18:41:36 compute-0 nova_compute[183278]:       <entry name="uuid">b53935fe-61d0-4662-9242-b4afae882b6e</entry>
Jan 21 18:41:36 compute-0 nova_compute[183278]:       <entry name="family">Virtual Machine</entry>
Jan 21 18:41:36 compute-0 nova_compute[183278]:     </system>
Jan 21 18:41:36 compute-0 nova_compute[183278]:   </sysinfo>
Jan 21 18:41:36 compute-0 nova_compute[183278]:   <os>
Jan 21 18:41:36 compute-0 nova_compute[183278]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 18:41:36 compute-0 nova_compute[183278]:     <boot dev="hd"/>
Jan 21 18:41:36 compute-0 nova_compute[183278]:     <smbios mode="sysinfo"/>
Jan 21 18:41:36 compute-0 nova_compute[183278]:   </os>
Jan 21 18:41:36 compute-0 nova_compute[183278]:   <features>
Jan 21 18:41:36 compute-0 nova_compute[183278]:     <acpi/>
Jan 21 18:41:36 compute-0 nova_compute[183278]:     <apic/>
Jan 21 18:41:36 compute-0 nova_compute[183278]:     <vmcoreinfo/>
Jan 21 18:41:36 compute-0 nova_compute[183278]:   </features>
Jan 21 18:41:36 compute-0 nova_compute[183278]:   <clock offset="utc">
Jan 21 18:41:36 compute-0 nova_compute[183278]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 18:41:36 compute-0 nova_compute[183278]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 18:41:36 compute-0 nova_compute[183278]:     <timer name="hpet" present="no"/>
Jan 21 18:41:36 compute-0 nova_compute[183278]:   </clock>
Jan 21 18:41:36 compute-0 nova_compute[183278]:   <cpu mode="custom" match="exact">
Jan 21 18:41:36 compute-0 nova_compute[183278]:     <model>Nehalem</model>
Jan 21 18:41:36 compute-0 nova_compute[183278]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 18:41:36 compute-0 nova_compute[183278]:   </cpu>
Jan 21 18:41:36 compute-0 nova_compute[183278]:   <devices>
Jan 21 18:41:36 compute-0 nova_compute[183278]:     <disk type="file" device="disk">
Jan 21 18:41:36 compute-0 nova_compute[183278]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 18:41:36 compute-0 nova_compute[183278]:       <source file="/var/lib/nova/instances/b53935fe-61d0-4662-9242-b4afae882b6e/disk"/>
Jan 21 18:41:36 compute-0 nova_compute[183278]:       <target dev="vda" bus="virtio"/>
Jan 21 18:41:36 compute-0 nova_compute[183278]:     </disk>
Jan 21 18:41:36 compute-0 nova_compute[183278]:     <disk type="file" device="cdrom">
Jan 21 18:41:36 compute-0 nova_compute[183278]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 18:41:36 compute-0 nova_compute[183278]:       <source file="/var/lib/nova/instances/b53935fe-61d0-4662-9242-b4afae882b6e/disk.config"/>
Jan 21 18:41:36 compute-0 nova_compute[183278]:       <target dev="sda" bus="sata"/>
Jan 21 18:41:36 compute-0 nova_compute[183278]:     </disk>
Jan 21 18:41:36 compute-0 nova_compute[183278]:     <interface type="ethernet">
Jan 21 18:41:36 compute-0 nova_compute[183278]:       <mac address="fa:16:3e:f4:5e:98"/>
Jan 21 18:41:36 compute-0 nova_compute[183278]:       <model type="virtio"/>
Jan 21 18:41:36 compute-0 nova_compute[183278]:       <driver name="vhost" rx_queue_size="512"/>
Jan 21 18:41:36 compute-0 nova_compute[183278]:       <mtu size="1442"/>
Jan 21 18:41:36 compute-0 nova_compute[183278]:       <target dev="tap31eaba9d-25"/>
Jan 21 18:41:36 compute-0 nova_compute[183278]:     </interface>
Jan 21 18:41:36 compute-0 nova_compute[183278]:     <serial type="pty">
Jan 21 18:41:36 compute-0 nova_compute[183278]:       <log file="/var/lib/nova/instances/b53935fe-61d0-4662-9242-b4afae882b6e/console.log" append="off"/>
Jan 21 18:41:36 compute-0 nova_compute[183278]:     </serial>
Jan 21 18:41:36 compute-0 nova_compute[183278]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 18:41:36 compute-0 nova_compute[183278]:     <video>
Jan 21 18:41:36 compute-0 nova_compute[183278]:       <model type="virtio"/>
Jan 21 18:41:36 compute-0 nova_compute[183278]:     </video>
Jan 21 18:41:36 compute-0 nova_compute[183278]:     <input type="tablet" bus="usb"/>
Jan 21 18:41:36 compute-0 nova_compute[183278]:     <rng model="virtio">
Jan 21 18:41:36 compute-0 nova_compute[183278]:       <backend model="random">/dev/urandom</backend>
Jan 21 18:41:36 compute-0 nova_compute[183278]:     </rng>
Jan 21 18:41:36 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root"/>
Jan 21 18:41:36 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:41:36 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:41:36 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:41:36 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:41:36 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:41:36 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:41:36 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:41:36 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:41:36 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:41:36 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:41:36 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:41:36 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:41:36 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:41:36 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:41:36 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:41:36 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:41:36 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:41:36 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:41:36 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:41:36 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:41:36 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:41:36 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:41:36 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:41:36 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:41:36 compute-0 nova_compute[183278]:     <controller type="usb" index="0"/>
Jan 21 18:41:36 compute-0 nova_compute[183278]:     <memballoon model="virtio">
Jan 21 18:41:36 compute-0 nova_compute[183278]:       <stats period="10"/>
Jan 21 18:41:36 compute-0 nova_compute[183278]:     </memballoon>
Jan 21 18:41:36 compute-0 nova_compute[183278]:   </devices>
Jan 21 18:41:36 compute-0 nova_compute[183278]: </domain>
Jan 21 18:41:36 compute-0 nova_compute[183278]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 18:41:36 compute-0 nova_compute[183278]: 2026-01-21 18:41:36.795 183284 DEBUG nova.compute.manager [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Preparing to wait for external event network-vif-plugged-31eaba9d-250a-4709-9d55-cc3e54eb722d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 21 18:41:36 compute-0 nova_compute[183278]: 2026-01-21 18:41:36.797 183284 DEBUG oslo_concurrency.lockutils [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] Acquiring lock "b53935fe-61d0-4662-9242-b4afae882b6e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:41:36 compute-0 nova_compute[183278]: 2026-01-21 18:41:36.797 183284 DEBUG oslo_concurrency.lockutils [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] Lock "b53935fe-61d0-4662-9242-b4afae882b6e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:41:36 compute-0 nova_compute[183278]: 2026-01-21 18:41:36.797 183284 DEBUG oslo_concurrency.lockutils [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] Lock "b53935fe-61d0-4662-9242-b4afae882b6e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:41:36 compute-0 nova_compute[183278]: 2026-01-21 18:41:36.798 183284 DEBUG nova.virt.libvirt.vif [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T18:41:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalancingStrategy-server-379452612',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancingstrategy-server-379452612',id=28,image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e24c6330d8214c038817e159aa32ee75',ramdisk_id='',reservation_id='r-uwu7zcyj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteWorkloadBalancingStrategy-1066838900',owner_user_name='tempest-TestExecuteWor
kloadBalancingStrategy-1066838900-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T18:41:33Z,user_data=None,user_id='02dcb8b770104e6fbacfd9aced0763ce',uuid=b53935fe-61d0-4662-9242-b4afae882b6e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "31eaba9d-250a-4709-9d55-cc3e54eb722d", "address": "fa:16:3e:f4:5e:98", "network": {"id": "f06157f1-7b10-4c16-804b-abfbafeaf616", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-1767892451-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e24c6330d8214c038817e159aa32ee75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31eaba9d-25", "ovs_interfaceid": "31eaba9d-250a-4709-9d55-cc3e54eb722d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 21 18:41:36 compute-0 nova_compute[183278]: 2026-01-21 18:41:36.798 183284 DEBUG nova.network.os_vif_util [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] Converting VIF {"id": "31eaba9d-250a-4709-9d55-cc3e54eb722d", "address": "fa:16:3e:f4:5e:98", "network": {"id": "f06157f1-7b10-4c16-804b-abfbafeaf616", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-1767892451-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e24c6330d8214c038817e159aa32ee75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31eaba9d-25", "ovs_interfaceid": "31eaba9d-250a-4709-9d55-cc3e54eb722d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 18:41:36 compute-0 nova_compute[183278]: 2026-01-21 18:41:36.799 183284 DEBUG nova.network.os_vif_util [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f4:5e:98,bridge_name='br-int',has_traffic_filtering=True,id=31eaba9d-250a-4709-9d55-cc3e54eb722d,network=Network(f06157f1-7b10-4c16-804b-abfbafeaf616),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31eaba9d-25') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 18:41:36 compute-0 nova_compute[183278]: 2026-01-21 18:41:36.799 183284 DEBUG os_vif [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f4:5e:98,bridge_name='br-int',has_traffic_filtering=True,id=31eaba9d-250a-4709-9d55-cc3e54eb722d,network=Network(f06157f1-7b10-4c16-804b-abfbafeaf616),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31eaba9d-25') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 21 18:41:36 compute-0 nova_compute[183278]: 2026-01-21 18:41:36.800 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:41:36 compute-0 nova_compute[183278]: 2026-01-21 18:41:36.801 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:41:36 compute-0 nova_compute[183278]: 2026-01-21 18:41:36.801 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 18:41:36 compute-0 nova_compute[183278]: 2026-01-21 18:41:36.805 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:41:36 compute-0 nova_compute[183278]: 2026-01-21 18:41:36.805 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap31eaba9d-25, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:41:36 compute-0 nova_compute[183278]: 2026-01-21 18:41:36.806 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap31eaba9d-25, col_values=(('external_ids', {'iface-id': '31eaba9d-250a-4709-9d55-cc3e54eb722d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f4:5e:98', 'vm-uuid': 'b53935fe-61d0-4662-9242-b4afae882b6e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:41:36 compute-0 nova_compute[183278]: 2026-01-21 18:41:36.807 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:41:36 compute-0 NetworkManager[55506]: <info>  [1769020896.8082] manager: (tap31eaba9d-25): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/84)
Jan 21 18:41:36 compute-0 nova_compute[183278]: 2026-01-21 18:41:36.809 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 18:41:36 compute-0 nova_compute[183278]: 2026-01-21 18:41:36.814 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:41:36 compute-0 nova_compute[183278]: 2026-01-21 18:41:36.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:41:36 compute-0 nova_compute[183278]: 2026-01-21 18:41:36.816 183284 INFO os_vif [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f4:5e:98,bridge_name='br-int',has_traffic_filtering=True,id=31eaba9d-250a-4709-9d55-cc3e54eb722d,network=Network(f06157f1-7b10-4c16-804b-abfbafeaf616),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31eaba9d-25')
Jan 21 18:41:36 compute-0 nova_compute[183278]: 2026-01-21 18:41:36.838 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:41:36 compute-0 nova_compute[183278]: 2026-01-21 18:41:36.838 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:41:36 compute-0 nova_compute[183278]: 2026-01-21 18:41:36.838 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:41:36 compute-0 nova_compute[183278]: 2026-01-21 18:41:36.838 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 18:41:36 compute-0 nova_compute[183278]: 2026-01-21 18:41:36.857 183284 DEBUG nova.virt.libvirt.driver [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 18:41:36 compute-0 nova_compute[183278]: 2026-01-21 18:41:36.857 183284 DEBUG nova.virt.libvirt.driver [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 18:41:36 compute-0 nova_compute[183278]: 2026-01-21 18:41:36.857 183284 DEBUG nova.virt.libvirt.driver [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] No VIF found with MAC fa:16:3e:f4:5e:98, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 21 18:41:36 compute-0 nova_compute[183278]: 2026-01-21 18:41:36.858 183284 INFO nova.virt.libvirt.driver [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Using config drive
Jan 21 18:41:36 compute-0 nova_compute[183278]: 2026-01-21 18:41:36.893 183284 DEBUG oslo_concurrency.processutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b53935fe-61d0-4662-9242-b4afae882b6e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:41:36 compute-0 nova_compute[183278]: 2026-01-21 18:41:36.948 183284 DEBUG oslo_concurrency.processutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b53935fe-61d0-4662-9242-b4afae882b6e/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:41:36 compute-0 nova_compute[183278]: 2026-01-21 18:41:36.949 183284 DEBUG oslo_concurrency.processutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b53935fe-61d0-4662-9242-b4afae882b6e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:41:37 compute-0 nova_compute[183278]: 2026-01-21 18:41:37.001 183284 DEBUG oslo_concurrency.processutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b53935fe-61d0-4662-9242-b4afae882b6e/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:41:37 compute-0 nova_compute[183278]: 2026-01-21 18:41:37.002 183284 WARNING nova.virt.libvirt.driver [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Periodic task is updating the host stat, it is trying to get disk instance-0000001c, but disk file was removed by concurrent operations such as resize.: FileNotFoundError: [Errno 2] No such file or directory: '/var/lib/nova/instances/b53935fe-61d0-4662-9242-b4afae882b6e/disk.config'
Jan 21 18:41:37 compute-0 nova_compute[183278]: 2026-01-21 18:41:37.119 183284 WARNING nova.virt.libvirt.driver [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 18:41:37 compute-0 nova_compute[183278]: 2026-01-21 18:41:37.120 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5854MB free_disk=73.378662109375GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 18:41:37 compute-0 nova_compute[183278]: 2026-01-21 18:41:37.120 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:41:37 compute-0 nova_compute[183278]: 2026-01-21 18:41:37.121 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:41:37 compute-0 nova_compute[183278]: 2026-01-21 18:41:37.123 183284 INFO nova.virt.libvirt.driver [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Creating config drive at /var/lib/nova/instances/b53935fe-61d0-4662-9242-b4afae882b6e/disk.config
Jan 21 18:41:37 compute-0 nova_compute[183278]: 2026-01-21 18:41:37.127 183284 DEBUG oslo_concurrency.processutils [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b53935fe-61d0-4662-9242-b4afae882b6e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpi3v2c2ck execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:41:37 compute-0 nova_compute[183278]: 2026-01-21 18:41:37.201 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Instance b53935fe-61d0-4662-9242-b4afae882b6e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 21 18:41:37 compute-0 nova_compute[183278]: 2026-01-21 18:41:37.202 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 18:41:37 compute-0 nova_compute[183278]: 2026-01-21 18:41:37.202 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 18:41:37 compute-0 nova_compute[183278]: 2026-01-21 18:41:37.249 183284 DEBUG nova.compute.provider_tree [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Inventory has not changed in ProviderTree for provider: 502e4243-611b-433d-a766-9b485d51652d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 18:41:37 compute-0 nova_compute[183278]: 2026-01-21 18:41:37.251 183284 DEBUG oslo_concurrency.processutils [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b53935fe-61d0-4662-9242-b4afae882b6e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpi3v2c2ck" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:41:37 compute-0 nova_compute[183278]: 2026-01-21 18:41:37.287 183284 DEBUG nova.scheduler.client.report [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Inventory has not changed for provider 502e4243-611b-433d-a766-9b485d51652d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 18:41:37 compute-0 kernel: tap31eaba9d-25: entered promiscuous mode
Jan 21 18:41:37 compute-0 NetworkManager[55506]: <info>  [1769020897.3077] manager: (tap31eaba9d-25): new Tun device (/org/freedesktop/NetworkManager/Devices/85)
Jan 21 18:41:37 compute-0 nova_compute[183278]: 2026-01-21 18:41:37.359 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 18:41:37 compute-0 ovn_controller[95419]: 2026-01-21T18:41:37Z|00197|binding|INFO|Claiming lport 31eaba9d-250a-4709-9d55-cc3e54eb722d for this chassis.
Jan 21 18:41:37 compute-0 ovn_controller[95419]: 2026-01-21T18:41:37Z|00198|binding|INFO|31eaba9d-250a-4709-9d55-cc3e54eb722d: Claiming fa:16:3e:f4:5e:98 10.100.0.7
Jan 21 18:41:37 compute-0 nova_compute[183278]: 2026-01-21 18:41:37.360 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.240s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:41:37 compute-0 nova_compute[183278]: 2026-01-21 18:41:37.361 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:41:37 compute-0 nova_compute[183278]: 2026-01-21 18:41:37.364 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:41:37 compute-0 nova_compute[183278]: 2026-01-21 18:41:37.370 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:41:37 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:41:37.379 104698 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f4:5e:98 10.100.0.7'], port_security=['fa:16:3e:f4:5e:98 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'b53935fe-61d0-4662-9242-b4afae882b6e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f06157f1-7b10-4c16-804b-abfbafeaf616', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e24c6330d8214c038817e159aa32ee75', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a8e8330a-c44f-4a58-8f42-8ca6339ea3b4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8c0eeedc-f49a-4818-9629-e33dc15bbbda, chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>], logical_port=31eaba9d-250a-4709-9d55-cc3e54eb722d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 18:41:37 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:41:37.380 104698 INFO neutron.agent.ovn.metadata.agent [-] Port 31eaba9d-250a-4709-9d55-cc3e54eb722d in datapath f06157f1-7b10-4c16-804b-abfbafeaf616 bound to our chassis
Jan 21 18:41:37 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:41:37.380 104698 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f06157f1-7b10-4c16-804b-abfbafeaf616
Jan 21 18:41:37 compute-0 systemd-udevd[213099]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 18:41:37 compute-0 systemd-machined[154592]: New machine qemu-19-instance-0000001c.
Jan 21 18:41:37 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:41:37.392 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[9c3e3c48-6559-4fe0-8ffb-f946479aacf6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:41:37 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:41:37.393 104698 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf06157f1-71 in ovnmeta-f06157f1-7b10-4c16-804b-abfbafeaf616 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 21 18:41:37 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:41:37.395 203892 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf06157f1-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 21 18:41:37 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:41:37.395 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[92885e75-d641-4f03-a7d6-82bcc48e4f63]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:41:37 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:41:37.396 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[ec4d9874-4a14-4eab-851d-6dcbaf5c943b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:41:37 compute-0 NetworkManager[55506]: <info>  [1769020897.4029] device (tap31eaba9d-25): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 18:41:37 compute-0 NetworkManager[55506]: <info>  [1769020897.4038] device (tap31eaba9d-25): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 18:41:37 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:41:37.408 105036 DEBUG oslo.privsep.daemon [-] privsep: reply[dd23b9d1-2203-4a37-abe4-2b5f23d872be]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:41:37 compute-0 nova_compute[183278]: 2026-01-21 18:41:37.420 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:41:37 compute-0 systemd[1]: Started Virtual Machine qemu-19-instance-0000001c.
Jan 21 18:41:37 compute-0 ovn_controller[95419]: 2026-01-21T18:41:37Z|00199|binding|INFO|Setting lport 31eaba9d-250a-4709-9d55-cc3e54eb722d ovn-installed in OVS
Jan 21 18:41:37 compute-0 ovn_controller[95419]: 2026-01-21T18:41:37Z|00200|binding|INFO|Setting lport 31eaba9d-250a-4709-9d55-cc3e54eb722d up in Southbound
Jan 21 18:41:37 compute-0 nova_compute[183278]: 2026-01-21 18:41:37.423 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:41:37 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:41:37.430 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[51f06896-a3e7-4d5d-8478-2b30f8b659b8]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:41:37 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:41:37.457 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[97105774-3f14-4385-a8a9-3b439a5b9932]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:41:37 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:41:37.462 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[bdd3a2a0-638c-4f71-88ec-860e065f0c29]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:41:37 compute-0 systemd-udevd[213103]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 18:41:37 compute-0 NetworkManager[55506]: <info>  [1769020897.4633] manager: (tapf06157f1-70): new Veth device (/org/freedesktop/NetworkManager/Devices/86)
Jan 21 18:41:37 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:41:37.491 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[e81e6bb1-db9a-46f8-b287-7975d9285439]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:41:37 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:41:37.494 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[7c4506b5-0d54-4808-a838-9ed781d43651]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:41:37 compute-0 NetworkManager[55506]: <info>  [1769020897.5166] device (tapf06157f1-70): carrier: link connected
Jan 21 18:41:37 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:41:37.520 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[e5e65216-ca9f-4716-8209-ad7d4105d900]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:41:37 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:41:37.537 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[717b12f9-6f67-4268-be11-298405831c29]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf06157f1-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5d:c3:51'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 60], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 546112, 'reachable_time': 35812, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213132, 'error': None, 'target': 'ovnmeta-f06157f1-7b10-4c16-804b-abfbafeaf616', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:41:37 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:41:37.550 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[44c903e0-1a77-4bfd-9512-f294d22ba390]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe5d:c351'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 546112, 'tstamp': 546112}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213133, 'error': None, 'target': 'ovnmeta-f06157f1-7b10-4c16-804b-abfbafeaf616', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:41:37 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:41:37.568 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[fa27b311-7ed8-4690-8691-43665c07f624]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf06157f1-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5d:c3:51'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 60], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 546112, 'reachable_time': 35812, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 213134, 'error': None, 'target': 'ovnmeta-f06157f1-7b10-4c16-804b-abfbafeaf616', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:41:37 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:41:37.601 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[c5dc15b7-5712-4fbd-bb4e-702fd4e4702b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:41:37 compute-0 nova_compute[183278]: 2026-01-21 18:41:37.631 183284 DEBUG nova.compute.manager [req-13a86b65-4364-4a53-8f83-dc7ba451a7e5 req-95a846e0-bc25-4ebb-ae32-f528493b0997 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Received event network-vif-plugged-31eaba9d-250a-4709-9d55-cc3e54eb722d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:41:37 compute-0 nova_compute[183278]: 2026-01-21 18:41:37.631 183284 DEBUG oslo_concurrency.lockutils [req-13a86b65-4364-4a53-8f83-dc7ba451a7e5 req-95a846e0-bc25-4ebb-ae32-f528493b0997 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "b53935fe-61d0-4662-9242-b4afae882b6e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:41:37 compute-0 nova_compute[183278]: 2026-01-21 18:41:37.637 183284 DEBUG oslo_concurrency.lockutils [req-13a86b65-4364-4a53-8f83-dc7ba451a7e5 req-95a846e0-bc25-4ebb-ae32-f528493b0997 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "b53935fe-61d0-4662-9242-b4afae882b6e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.005s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:41:37 compute-0 nova_compute[183278]: 2026-01-21 18:41:37.637 183284 DEBUG oslo_concurrency.lockutils [req-13a86b65-4364-4a53-8f83-dc7ba451a7e5 req-95a846e0-bc25-4ebb-ae32-f528493b0997 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "b53935fe-61d0-4662-9242-b4afae882b6e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:41:37 compute-0 nova_compute[183278]: 2026-01-21 18:41:37.638 183284 DEBUG nova.compute.manager [req-13a86b65-4364-4a53-8f83-dc7ba451a7e5 req-95a846e0-bc25-4ebb-ae32-f528493b0997 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Processing event network-vif-plugged-31eaba9d-250a-4709-9d55-cc3e54eb722d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 21 18:41:37 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:41:37.663 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[9071c096-6384-4f1f-bcff-7c565d57a027]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:41:37 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:41:37.664 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf06157f1-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:41:37 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:41:37.664 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 18:41:37 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:41:37.665 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf06157f1-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:41:37 compute-0 nova_compute[183278]: 2026-01-21 18:41:37.666 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:41:37 compute-0 NetworkManager[55506]: <info>  [1769020897.6673] manager: (tapf06157f1-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/87)
Jan 21 18:41:37 compute-0 kernel: tapf06157f1-70: entered promiscuous mode
Jan 21 18:41:37 compute-0 nova_compute[183278]: 2026-01-21 18:41:37.668 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:41:37 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:41:37.673 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf06157f1-70, col_values=(('external_ids', {'iface-id': 'e4da62b0-9395-41ec-8186-822e6f302609'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:41:37 compute-0 nova_compute[183278]: 2026-01-21 18:41:37.674 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:41:37 compute-0 ovn_controller[95419]: 2026-01-21T18:41:37Z|00201|binding|INFO|Releasing lport e4da62b0-9395-41ec-8186-822e6f302609 from this chassis (sb_readonly=0)
Jan 21 18:41:37 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:41:37.677 104698 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f06157f1-7b10-4c16-804b-abfbafeaf616.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f06157f1-7b10-4c16-804b-abfbafeaf616.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 21 18:41:37 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:41:37.686 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[93492106-155a-4138-b82f-b6052bf8491f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:41:37 compute-0 nova_compute[183278]: 2026-01-21 18:41:37.687 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:41:37 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:41:37.688 104698 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 18:41:37 compute-0 ovn_metadata_agent[104693]: global
Jan 21 18:41:37 compute-0 ovn_metadata_agent[104693]:     log         /dev/log local0 debug
Jan 21 18:41:37 compute-0 ovn_metadata_agent[104693]:     log-tag     haproxy-metadata-proxy-f06157f1-7b10-4c16-804b-abfbafeaf616
Jan 21 18:41:37 compute-0 ovn_metadata_agent[104693]:     user        root
Jan 21 18:41:37 compute-0 ovn_metadata_agent[104693]:     group       root
Jan 21 18:41:37 compute-0 ovn_metadata_agent[104693]:     maxconn     1024
Jan 21 18:41:37 compute-0 ovn_metadata_agent[104693]:     pidfile     /var/lib/neutron/external/pids/f06157f1-7b10-4c16-804b-abfbafeaf616.pid.haproxy
Jan 21 18:41:37 compute-0 ovn_metadata_agent[104693]:     daemon
Jan 21 18:41:37 compute-0 ovn_metadata_agent[104693]: 
Jan 21 18:41:37 compute-0 ovn_metadata_agent[104693]: defaults
Jan 21 18:41:37 compute-0 ovn_metadata_agent[104693]:     log global
Jan 21 18:41:37 compute-0 ovn_metadata_agent[104693]:     mode http
Jan 21 18:41:37 compute-0 ovn_metadata_agent[104693]:     option httplog
Jan 21 18:41:37 compute-0 ovn_metadata_agent[104693]:     option dontlognull
Jan 21 18:41:37 compute-0 ovn_metadata_agent[104693]:     option http-server-close
Jan 21 18:41:37 compute-0 ovn_metadata_agent[104693]:     option forwardfor
Jan 21 18:41:37 compute-0 ovn_metadata_agent[104693]:     retries                 3
Jan 21 18:41:37 compute-0 ovn_metadata_agent[104693]:     timeout http-request    30s
Jan 21 18:41:37 compute-0 ovn_metadata_agent[104693]:     timeout connect         30s
Jan 21 18:41:37 compute-0 ovn_metadata_agent[104693]:     timeout client          32s
Jan 21 18:41:37 compute-0 ovn_metadata_agent[104693]:     timeout server          32s
Jan 21 18:41:37 compute-0 ovn_metadata_agent[104693]:     timeout http-keep-alive 30s
Jan 21 18:41:37 compute-0 ovn_metadata_agent[104693]: 
Jan 21 18:41:37 compute-0 ovn_metadata_agent[104693]: 
Jan 21 18:41:37 compute-0 ovn_metadata_agent[104693]: listen listener
Jan 21 18:41:37 compute-0 ovn_metadata_agent[104693]:     bind 169.254.169.254:80
Jan 21 18:41:37 compute-0 ovn_metadata_agent[104693]:     server metadata /var/lib/neutron/metadata_proxy
Jan 21 18:41:37 compute-0 ovn_metadata_agent[104693]:     http-request add-header X-OVN-Network-ID f06157f1-7b10-4c16-804b-abfbafeaf616
Jan 21 18:41:37 compute-0 ovn_metadata_agent[104693]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 21 18:41:37 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:41:37.689 104698 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f06157f1-7b10-4c16-804b-abfbafeaf616', 'env', 'PROCESS_TAG=haproxy-f06157f1-7b10-4c16-804b-abfbafeaf616', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f06157f1-7b10-4c16-804b-abfbafeaf616.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 21 18:41:37 compute-0 nova_compute[183278]: 2026-01-21 18:41:37.718 183284 DEBUG nova.network.neutron [req-b3091527-f72f-4bfb-89d1-14ef79ec2786 req-dd314fdb-0994-43af-a5ad-f4b6c0eedb17 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Updated VIF entry in instance network info cache for port 31eaba9d-250a-4709-9d55-cc3e54eb722d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 18:41:37 compute-0 nova_compute[183278]: 2026-01-21 18:41:37.719 183284 DEBUG nova.network.neutron [req-b3091527-f72f-4bfb-89d1-14ef79ec2786 req-dd314fdb-0994-43af-a5ad-f4b6c0eedb17 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Updating instance_info_cache with network_info: [{"id": "31eaba9d-250a-4709-9d55-cc3e54eb722d", "address": "fa:16:3e:f4:5e:98", "network": {"id": "f06157f1-7b10-4c16-804b-abfbafeaf616", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-1767892451-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e24c6330d8214c038817e159aa32ee75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31eaba9d-25", "ovs_interfaceid": "31eaba9d-250a-4709-9d55-cc3e54eb722d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:41:37 compute-0 nova_compute[183278]: 2026-01-21 18:41:37.735 183284 DEBUG oslo_concurrency.lockutils [req-b3091527-f72f-4bfb-89d1-14ef79ec2786 req-dd314fdb-0994-43af-a5ad-f4b6c0eedb17 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Releasing lock "refresh_cache-b53935fe-61d0-4662-9242-b4afae882b6e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 18:41:38 compute-0 podman[213166]: 2026-01-21 18:41:38.041328859 +0000 UTC m=+0.047573671 container create 4337408152a2dfd0cb773578c5e4ebec44e809f3c3e302cb8cc0301269ff45a9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f06157f1-7b10-4c16-804b-abfbafeaf616, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 18:41:38 compute-0 systemd[1]: Started libpod-conmon-4337408152a2dfd0cb773578c5e4ebec44e809f3c3e302cb8cc0301269ff45a9.scope.
Jan 21 18:41:38 compute-0 podman[213166]: 2026-01-21 18:41:38.01777555 +0000 UTC m=+0.024020392 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 18:41:38 compute-0 systemd[1]: Started libcrun container.
Jan 21 18:41:38 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8b67ca047e9e75ed69a0653b909cc623732bf5df21b033fd2cdb6dac1d9d9029/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 18:41:38 compute-0 podman[213166]: 2026-01-21 18:41:38.137929345 +0000 UTC m=+0.144174177 container init 4337408152a2dfd0cb773578c5e4ebec44e809f3c3e302cb8cc0301269ff45a9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f06157f1-7b10-4c16-804b-abfbafeaf616, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 21 18:41:38 compute-0 podman[213166]: 2026-01-21 18:41:38.143204141 +0000 UTC m=+0.149448953 container start 4337408152a2dfd0cb773578c5e4ebec44e809f3c3e302cb8cc0301269ff45a9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f06157f1-7b10-4c16-804b-abfbafeaf616, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 21 18:41:38 compute-0 neutron-haproxy-ovnmeta-f06157f1-7b10-4c16-804b-abfbafeaf616[213181]: [NOTICE]   (213185) : New worker (213187) forked
Jan 21 18:41:38 compute-0 neutron-haproxy-ovnmeta-f06157f1-7b10-4c16-804b-abfbafeaf616[213181]: [NOTICE]   (213185) : Loading success.
Jan 21 18:41:38 compute-0 nova_compute[183278]: 2026-01-21 18:41:38.321 183284 DEBUG nova.compute.manager [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 18:41:38 compute-0 nova_compute[183278]: 2026-01-21 18:41:38.323 183284 DEBUG nova.virt.driver [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Emitting event <LifecycleEvent: 1769020898.3208249, b53935fe-61d0-4662-9242-b4afae882b6e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:41:38 compute-0 nova_compute[183278]: 2026-01-21 18:41:38.324 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] VM Started (Lifecycle Event)
Jan 21 18:41:38 compute-0 nova_compute[183278]: 2026-01-21 18:41:38.327 183284 DEBUG nova.virt.libvirt.driver [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 18:41:38 compute-0 nova_compute[183278]: 2026-01-21 18:41:38.330 183284 INFO nova.virt.libvirt.driver [-] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Instance spawned successfully.
Jan 21 18:41:38 compute-0 nova_compute[183278]: 2026-01-21 18:41:38.331 183284 DEBUG nova.virt.libvirt.driver [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 21 18:41:38 compute-0 nova_compute[183278]: 2026-01-21 18:41:38.349 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:41:38 compute-0 nova_compute[183278]: 2026-01-21 18:41:38.354 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 18:41:38 compute-0 nova_compute[183278]: 2026-01-21 18:41:38.358 183284 DEBUG nova.virt.libvirt.driver [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:41:38 compute-0 nova_compute[183278]: 2026-01-21 18:41:38.358 183284 DEBUG nova.virt.libvirt.driver [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:41:38 compute-0 nova_compute[183278]: 2026-01-21 18:41:38.359 183284 DEBUG nova.virt.libvirt.driver [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:41:38 compute-0 nova_compute[183278]: 2026-01-21 18:41:38.359 183284 DEBUG nova.virt.libvirt.driver [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:41:38 compute-0 nova_compute[183278]: 2026-01-21 18:41:38.359 183284 DEBUG nova.virt.libvirt.driver [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:41:38 compute-0 nova_compute[183278]: 2026-01-21 18:41:38.360 183284 DEBUG nova.virt.libvirt.driver [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:41:38 compute-0 nova_compute[183278]: 2026-01-21 18:41:38.363 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:41:38 compute-0 nova_compute[183278]: 2026-01-21 18:41:38.373 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 18:41:38 compute-0 nova_compute[183278]: 2026-01-21 18:41:38.373 183284 DEBUG nova.virt.driver [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Emitting event <LifecycleEvent: 1769020898.3241222, b53935fe-61d0-4662-9242-b4afae882b6e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:41:38 compute-0 nova_compute[183278]: 2026-01-21 18:41:38.373 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] VM Paused (Lifecycle Event)
Jan 21 18:41:38 compute-0 nova_compute[183278]: 2026-01-21 18:41:38.497 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:41:38 compute-0 nova_compute[183278]: 2026-01-21 18:41:38.500 183284 DEBUG nova.virt.driver [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Emitting event <LifecycleEvent: 1769020898.3248305, b53935fe-61d0-4662-9242-b4afae882b6e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:41:38 compute-0 nova_compute[183278]: 2026-01-21 18:41:38.500 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] VM Resumed (Lifecycle Event)
Jan 21 18:41:38 compute-0 nova_compute[183278]: 2026-01-21 18:41:38.699 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:41:38 compute-0 nova_compute[183278]: 2026-01-21 18:41:38.703 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 18:41:38 compute-0 nova_compute[183278]: 2026-01-21 18:41:38.746 183284 INFO nova.compute.manager [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Took 5.12 seconds to spawn the instance on the hypervisor.
Jan 21 18:41:38 compute-0 nova_compute[183278]: 2026-01-21 18:41:38.746 183284 DEBUG nova.compute.manager [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:41:38 compute-0 nova_compute[183278]: 2026-01-21 18:41:38.855 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 18:41:38 compute-0 nova_compute[183278]: 2026-01-21 18:41:38.891 183284 INFO nova.compute.manager [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Took 5.68 seconds to build instance.
Jan 21 18:41:38 compute-0 nova_compute[183278]: 2026-01-21 18:41:38.910 183284 DEBUG oslo_concurrency.lockutils [None req-b8e44020-ce29-46a5-8036-edcf5c9ca36c 02dcb8b770104e6fbacfd9aced0763ce e24c6330d8214c038817e159aa32ee75 - - default default] Lock "b53935fe-61d0-4662-9242-b4afae882b6e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.773s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:41:39 compute-0 nova_compute[183278]: 2026-01-21 18:41:39.723 183284 DEBUG nova.compute.manager [req-70268b3e-bade-43d5-9c34-6b9e64530f06 req-dfcdfee2-3fab-44d9-a033-60d71ce316b5 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Received event network-vif-plugged-31eaba9d-250a-4709-9d55-cc3e54eb722d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:41:39 compute-0 nova_compute[183278]: 2026-01-21 18:41:39.723 183284 DEBUG oslo_concurrency.lockutils [req-70268b3e-bade-43d5-9c34-6b9e64530f06 req-dfcdfee2-3fab-44d9-a033-60d71ce316b5 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "b53935fe-61d0-4662-9242-b4afae882b6e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:41:39 compute-0 nova_compute[183278]: 2026-01-21 18:41:39.723 183284 DEBUG oslo_concurrency.lockutils [req-70268b3e-bade-43d5-9c34-6b9e64530f06 req-dfcdfee2-3fab-44d9-a033-60d71ce316b5 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "b53935fe-61d0-4662-9242-b4afae882b6e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:41:39 compute-0 nova_compute[183278]: 2026-01-21 18:41:39.724 183284 DEBUG oslo_concurrency.lockutils [req-70268b3e-bade-43d5-9c34-6b9e64530f06 req-dfcdfee2-3fab-44d9-a033-60d71ce316b5 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "b53935fe-61d0-4662-9242-b4afae882b6e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:41:39 compute-0 nova_compute[183278]: 2026-01-21 18:41:39.724 183284 DEBUG nova.compute.manager [req-70268b3e-bade-43d5-9c34-6b9e64530f06 req-dfcdfee2-3fab-44d9-a033-60d71ce316b5 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] No waiting events found dispatching network-vif-plugged-31eaba9d-250a-4709-9d55-cc3e54eb722d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:41:39 compute-0 nova_compute[183278]: 2026-01-21 18:41:39.724 183284 WARNING nova.compute.manager [req-70268b3e-bade-43d5-9c34-6b9e64530f06 req-dfcdfee2-3fab-44d9-a033-60d71ce316b5 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Received unexpected event network-vif-plugged-31eaba9d-250a-4709-9d55-cc3e54eb722d for instance with vm_state active and task_state None.
Jan 21 18:41:39 compute-0 nova_compute[183278]: 2026-01-21 18:41:39.811 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:41:39 compute-0 nova_compute[183278]: 2026-01-21 18:41:39.829 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:41:41 compute-0 nova_compute[183278]: 2026-01-21 18:41:41.662 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:41:41 compute-0 nova_compute[183278]: 2026-01-21 18:41:41.808 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:41:44 compute-0 nova_compute[183278]: 2026-01-21 18:41:44.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:41:44 compute-0 nova_compute[183278]: 2026-01-21 18:41:44.817 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 18:41:46 compute-0 nova_compute[183278]: 2026-01-21 18:41:46.664 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:41:46 compute-0 nova_compute[183278]: 2026-01-21 18:41:46.809 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:41:47 compute-0 nova_compute[183278]: 2026-01-21 18:41:47.637 183284 DEBUG nova.virt.libvirt.driver [None req-9f23b458-558d-42f2-87f6-59ac7b33b781 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Check if temp file /var/lib/nova/instances/tmp58siowe1 exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Jan 21 18:41:47 compute-0 nova_compute[183278]: 2026-01-21 18:41:47.638 183284 DEBUG nova.compute.manager [None req-9f23b458-558d-42f2-87f6-59ac7b33b781 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp58siowe1',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='b53935fe-61d0-4662-9242-b4afae882b6e',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Jan 21 18:41:48 compute-0 nova_compute[183278]: 2026-01-21 18:41:48.331 183284 DEBUG oslo_concurrency.processutils [None req-9f23b458-558d-42f2-87f6-59ac7b33b781 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b53935fe-61d0-4662-9242-b4afae882b6e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:41:48 compute-0 nova_compute[183278]: 2026-01-21 18:41:48.387 183284 DEBUG oslo_concurrency.processutils [None req-9f23b458-558d-42f2-87f6-59ac7b33b781 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b53935fe-61d0-4662-9242-b4afae882b6e/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:41:48 compute-0 nova_compute[183278]: 2026-01-21 18:41:48.388 183284 DEBUG oslo_concurrency.processutils [None req-9f23b458-558d-42f2-87f6-59ac7b33b781 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b53935fe-61d0-4662-9242-b4afae882b6e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:41:48 compute-0 nova_compute[183278]: 2026-01-21 18:41:48.448 183284 DEBUG oslo_concurrency.processutils [None req-9f23b458-558d-42f2-87f6-59ac7b33b781 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b53935fe-61d0-4662-9242-b4afae882b6e/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:41:50 compute-0 ovn_controller[95419]: 2026-01-21T18:41:50Z|00032|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:f4:5e:98 10.100.0.7
Jan 21 18:41:50 compute-0 ovn_controller[95419]: 2026-01-21T18:41:50Z|00033|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f4:5e:98 10.100.0.7
Jan 21 18:41:50 compute-0 sshd-session[213229]: Accepted publickey for nova from 192.168.122.101 port 41072 ssh2: ECDSA SHA256:29a5JNhHHz2bb0ACqZTr6qOKeSRnhiTRA8SK+rzn9gs
Jan 21 18:41:50 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Jan 21 18:41:50 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 21 18:41:50 compute-0 systemd-logind[782]: New session 42 of user nova.
Jan 21 18:41:50 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 21 18:41:50 compute-0 systemd[1]: Starting User Manager for UID 42436...
Jan 21 18:41:50 compute-0 systemd[213233]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 21 18:41:50 compute-0 systemd[213233]: Queued start job for default target Main User Target.
Jan 21 18:41:50 compute-0 systemd[213233]: Created slice User Application Slice.
Jan 21 18:41:50 compute-0 systemd[213233]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 21 18:41:50 compute-0 systemd[213233]: Started Daily Cleanup of User's Temporary Directories.
Jan 21 18:41:50 compute-0 systemd[213233]: Reached target Paths.
Jan 21 18:41:50 compute-0 systemd[213233]: Reached target Timers.
Jan 21 18:41:50 compute-0 systemd[213233]: Starting D-Bus User Message Bus Socket...
Jan 21 18:41:50 compute-0 systemd[213233]: Starting Create User's Volatile Files and Directories...
Jan 21 18:41:50 compute-0 systemd[213233]: Listening on D-Bus User Message Bus Socket.
Jan 21 18:41:50 compute-0 systemd[213233]: Reached target Sockets.
Jan 21 18:41:50 compute-0 systemd[213233]: Finished Create User's Volatile Files and Directories.
Jan 21 18:41:50 compute-0 systemd[213233]: Reached target Basic System.
Jan 21 18:41:50 compute-0 systemd[213233]: Reached target Main User Target.
Jan 21 18:41:50 compute-0 systemd[213233]: Startup finished in 135ms.
Jan 21 18:41:50 compute-0 systemd[1]: Started User Manager for UID 42436.
Jan 21 18:41:50 compute-0 systemd[1]: Started Session 42 of User nova.
Jan 21 18:41:50 compute-0 sshd-session[213229]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 21 18:41:51 compute-0 sshd-session[213248]: Received disconnect from 192.168.122.101 port 41072:11: disconnected by user
Jan 21 18:41:51 compute-0 sshd-session[213248]: Disconnected from user nova 192.168.122.101 port 41072
Jan 21 18:41:51 compute-0 sshd-session[213229]: pam_unix(sshd:session): session closed for user nova
Jan 21 18:41:51 compute-0 systemd[1]: session-42.scope: Deactivated successfully.
Jan 21 18:41:51 compute-0 systemd-logind[782]: Session 42 logged out. Waiting for processes to exit.
Jan 21 18:41:51 compute-0 systemd-logind[782]: Removed session 42.
Jan 21 18:41:51 compute-0 nova_compute[183278]: 2026-01-21 18:41:51.665 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:41:51 compute-0 nova_compute[183278]: 2026-01-21 18:41:51.699 183284 DEBUG nova.compute.manager [req-94d239f1-5d5b-4a6e-9577-0b0605440a16 req-de08c49b-2fed-429b-8953-81e00c8db9dd 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Received event network-vif-unplugged-31eaba9d-250a-4709-9d55-cc3e54eb722d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:41:51 compute-0 nova_compute[183278]: 2026-01-21 18:41:51.700 183284 DEBUG oslo_concurrency.lockutils [req-94d239f1-5d5b-4a6e-9577-0b0605440a16 req-de08c49b-2fed-429b-8953-81e00c8db9dd 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "b53935fe-61d0-4662-9242-b4afae882b6e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:41:51 compute-0 nova_compute[183278]: 2026-01-21 18:41:51.700 183284 DEBUG oslo_concurrency.lockutils [req-94d239f1-5d5b-4a6e-9577-0b0605440a16 req-de08c49b-2fed-429b-8953-81e00c8db9dd 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "b53935fe-61d0-4662-9242-b4afae882b6e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:41:51 compute-0 nova_compute[183278]: 2026-01-21 18:41:51.700 183284 DEBUG oslo_concurrency.lockutils [req-94d239f1-5d5b-4a6e-9577-0b0605440a16 req-de08c49b-2fed-429b-8953-81e00c8db9dd 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "b53935fe-61d0-4662-9242-b4afae882b6e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:41:51 compute-0 nova_compute[183278]: 2026-01-21 18:41:51.700 183284 DEBUG nova.compute.manager [req-94d239f1-5d5b-4a6e-9577-0b0605440a16 req-de08c49b-2fed-429b-8953-81e00c8db9dd 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] No waiting events found dispatching network-vif-unplugged-31eaba9d-250a-4709-9d55-cc3e54eb722d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:41:51 compute-0 nova_compute[183278]: 2026-01-21 18:41:51.701 183284 DEBUG nova.compute.manager [req-94d239f1-5d5b-4a6e-9577-0b0605440a16 req-de08c49b-2fed-429b-8953-81e00c8db9dd 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Received event network-vif-unplugged-31eaba9d-250a-4709-9d55-cc3e54eb722d for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 21 18:41:51 compute-0 nova_compute[183278]: 2026-01-21 18:41:51.773 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:41:51 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:41:51.773 104698 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=27, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e2:aa:66', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f6:39:87:73:c8'}, ipsec=False) old=SB_Global(nb_cfg=26) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 18:41:51 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:41:51.774 104698 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 21 18:41:51 compute-0 nova_compute[183278]: 2026-01-21 18:41:51.811 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:41:52 compute-0 nova_compute[183278]: 2026-01-21 18:41:52.284 183284 INFO nova.compute.manager [None req-9f23b458-558d-42f2-87f6-59ac7b33b781 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Took 3.83 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Jan 21 18:41:52 compute-0 nova_compute[183278]: 2026-01-21 18:41:52.284 183284 DEBUG nova.compute.manager [None req-9f23b458-558d-42f2-87f6-59ac7b33b781 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 18:41:52 compute-0 nova_compute[183278]: 2026-01-21 18:41:52.301 183284 DEBUG nova.compute.manager [None req-9f23b458-558d-42f2-87f6-59ac7b33b781 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp58siowe1',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='b53935fe-61d0-4662-9242-b4afae882b6e',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(da0f7378-4758-4827-b3bf-a67f6a912f5c),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Jan 21 18:41:52 compute-0 nova_compute[183278]: 2026-01-21 18:41:52.322 183284 DEBUG nova.objects.instance [None req-9f23b458-558d-42f2-87f6-59ac7b33b781 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] Lazy-loading 'migration_context' on Instance uuid b53935fe-61d0-4662-9242-b4afae882b6e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:41:52 compute-0 nova_compute[183278]: 2026-01-21 18:41:52.324 183284 DEBUG nova.virt.libvirt.driver [None req-9f23b458-558d-42f2-87f6-59ac7b33b781 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Jan 21 18:41:52 compute-0 nova_compute[183278]: 2026-01-21 18:41:52.325 183284 DEBUG nova.virt.libvirt.driver [None req-9f23b458-558d-42f2-87f6-59ac7b33b781 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Jan 21 18:41:52 compute-0 nova_compute[183278]: 2026-01-21 18:41:52.326 183284 DEBUG nova.virt.libvirt.driver [None req-9f23b458-558d-42f2-87f6-59ac7b33b781 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Jan 21 18:41:52 compute-0 nova_compute[183278]: 2026-01-21 18:41:52.349 183284 DEBUG nova.virt.libvirt.vif [None req-9f23b458-558d-42f2-87f6-59ac7b33b781 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T18:41:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalancingStrategy-server-379452612',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancingstrategy-server-379452612',id=28,image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T18:41:38Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='e24c6330d8214c038817e159aa32ee75',ramdisk_id='',reservation_id='r-uwu7zcyj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadBalancingStrategy-1066838900',owner_user_name='tempest-TestExecuteWorkloadBalancingStrategy-1066838900-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T18:41:38Z,user_data=None,user_id='02dcb8b770104e6fbacfd9aced0763ce',uuid=b53935fe-61d0-4662-9242-b4afae882b6e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "31eaba9d-250a-4709-9d55-cc3e54eb722d", "address": "fa:16:3e:f4:5e:98", "network": {"id": "f06157f1-7b10-4c16-804b-abfbafeaf616", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-1767892451-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e24c6330d8214c038817e159aa32ee75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap31eaba9d-25", "ovs_interfaceid": "31eaba9d-250a-4709-9d55-cc3e54eb722d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 18:41:52 compute-0 nova_compute[183278]: 2026-01-21 18:41:52.350 183284 DEBUG nova.network.os_vif_util [None req-9f23b458-558d-42f2-87f6-59ac7b33b781 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] Converting VIF {"id": "31eaba9d-250a-4709-9d55-cc3e54eb722d", "address": "fa:16:3e:f4:5e:98", "network": {"id": "f06157f1-7b10-4c16-804b-abfbafeaf616", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-1767892451-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e24c6330d8214c038817e159aa32ee75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap31eaba9d-25", "ovs_interfaceid": "31eaba9d-250a-4709-9d55-cc3e54eb722d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 18:41:52 compute-0 nova_compute[183278]: 2026-01-21 18:41:52.351 183284 DEBUG nova.network.os_vif_util [None req-9f23b458-558d-42f2-87f6-59ac7b33b781 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f4:5e:98,bridge_name='br-int',has_traffic_filtering=True,id=31eaba9d-250a-4709-9d55-cc3e54eb722d,network=Network(f06157f1-7b10-4c16-804b-abfbafeaf616),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31eaba9d-25') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 18:41:52 compute-0 nova_compute[183278]: 2026-01-21 18:41:52.351 183284 DEBUG nova.virt.libvirt.migration [None req-9f23b458-558d-42f2-87f6-59ac7b33b781 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Updating guest XML with vif config: <interface type="ethernet">
Jan 21 18:41:52 compute-0 nova_compute[183278]:   <mac address="fa:16:3e:f4:5e:98"/>
Jan 21 18:41:52 compute-0 nova_compute[183278]:   <model type="virtio"/>
Jan 21 18:41:52 compute-0 nova_compute[183278]:   <driver name="vhost" rx_queue_size="512"/>
Jan 21 18:41:52 compute-0 nova_compute[183278]:   <mtu size="1442"/>
Jan 21 18:41:52 compute-0 nova_compute[183278]:   <target dev="tap31eaba9d-25"/>
Jan 21 18:41:52 compute-0 nova_compute[183278]: </interface>
Jan 21 18:41:52 compute-0 nova_compute[183278]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Jan 21 18:41:52 compute-0 nova_compute[183278]: 2026-01-21 18:41:52.352 183284 DEBUG nova.virt.libvirt.driver [None req-9f23b458-558d-42f2-87f6-59ac7b33b781 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Jan 21 18:41:52 compute-0 nova_compute[183278]: 2026-01-21 18:41:52.828 183284 DEBUG nova.virt.libvirt.migration [None req-9f23b458-558d-42f2-87f6-59ac7b33b781 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 21 18:41:52 compute-0 nova_compute[183278]: 2026-01-21 18:41:52.829 183284 INFO nova.virt.libvirt.migration [None req-9f23b458-558d-42f2-87f6-59ac7b33b781 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Increasing downtime to 50 ms after 0 sec elapsed time
Jan 21 18:41:52 compute-0 nova_compute[183278]: 2026-01-21 18:41:52.906 183284 INFO nova.virt.libvirt.driver [None req-9f23b458-558d-42f2-87f6-59ac7b33b781 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Jan 21 18:41:53 compute-0 nova_compute[183278]: 2026-01-21 18:41:53.411 183284 DEBUG nova.virt.libvirt.migration [None req-9f23b458-558d-42f2-87f6-59ac7b33b781 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 21 18:41:53 compute-0 nova_compute[183278]: 2026-01-21 18:41:53.412 183284 DEBUG nova.virt.libvirt.migration [None req-9f23b458-558d-42f2-87f6-59ac7b33b781 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Jan 21 18:41:53 compute-0 nova_compute[183278]: 2026-01-21 18:41:53.820 183284 DEBUG nova.compute.manager [req-87af27bf-6283-4107-9c2a-e413bd6c053c req-6ab22fd1-bdda-4487-9505-537a60a15de4 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Received event network-vif-plugged-31eaba9d-250a-4709-9d55-cc3e54eb722d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:41:53 compute-0 nova_compute[183278]: 2026-01-21 18:41:53.820 183284 DEBUG oslo_concurrency.lockutils [req-87af27bf-6283-4107-9c2a-e413bd6c053c req-6ab22fd1-bdda-4487-9505-537a60a15de4 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "b53935fe-61d0-4662-9242-b4afae882b6e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:41:53 compute-0 nova_compute[183278]: 2026-01-21 18:41:53.821 183284 DEBUG oslo_concurrency.lockutils [req-87af27bf-6283-4107-9c2a-e413bd6c053c req-6ab22fd1-bdda-4487-9505-537a60a15de4 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "b53935fe-61d0-4662-9242-b4afae882b6e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:41:53 compute-0 nova_compute[183278]: 2026-01-21 18:41:53.821 183284 DEBUG oslo_concurrency.lockutils [req-87af27bf-6283-4107-9c2a-e413bd6c053c req-6ab22fd1-bdda-4487-9505-537a60a15de4 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "b53935fe-61d0-4662-9242-b4afae882b6e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:41:53 compute-0 nova_compute[183278]: 2026-01-21 18:41:53.821 183284 DEBUG nova.compute.manager [req-87af27bf-6283-4107-9c2a-e413bd6c053c req-6ab22fd1-bdda-4487-9505-537a60a15de4 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] No waiting events found dispatching network-vif-plugged-31eaba9d-250a-4709-9d55-cc3e54eb722d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:41:53 compute-0 nova_compute[183278]: 2026-01-21 18:41:53.821 183284 WARNING nova.compute.manager [req-87af27bf-6283-4107-9c2a-e413bd6c053c req-6ab22fd1-bdda-4487-9505-537a60a15de4 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Received unexpected event network-vif-plugged-31eaba9d-250a-4709-9d55-cc3e54eb722d for instance with vm_state active and task_state migrating.
Jan 21 18:41:53 compute-0 nova_compute[183278]: 2026-01-21 18:41:53.822 183284 DEBUG nova.compute.manager [req-87af27bf-6283-4107-9c2a-e413bd6c053c req-6ab22fd1-bdda-4487-9505-537a60a15de4 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Received event network-changed-31eaba9d-250a-4709-9d55-cc3e54eb722d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:41:53 compute-0 nova_compute[183278]: 2026-01-21 18:41:53.822 183284 DEBUG nova.compute.manager [req-87af27bf-6283-4107-9c2a-e413bd6c053c req-6ab22fd1-bdda-4487-9505-537a60a15de4 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Refreshing instance network info cache due to event network-changed-31eaba9d-250a-4709-9d55-cc3e54eb722d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 18:41:53 compute-0 nova_compute[183278]: 2026-01-21 18:41:53.822 183284 DEBUG oslo_concurrency.lockutils [req-87af27bf-6283-4107-9c2a-e413bd6c053c req-6ab22fd1-bdda-4487-9505-537a60a15de4 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "refresh_cache-b53935fe-61d0-4662-9242-b4afae882b6e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:41:53 compute-0 nova_compute[183278]: 2026-01-21 18:41:53.823 183284 DEBUG oslo_concurrency.lockutils [req-87af27bf-6283-4107-9c2a-e413bd6c053c req-6ab22fd1-bdda-4487-9505-537a60a15de4 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquired lock "refresh_cache-b53935fe-61d0-4662-9242-b4afae882b6e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 18:41:53 compute-0 nova_compute[183278]: 2026-01-21 18:41:53.823 183284 DEBUG nova.network.neutron [req-87af27bf-6283-4107-9c2a-e413bd6c053c req-6ab22fd1-bdda-4487-9505-537a60a15de4 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Refreshing network info cache for port 31eaba9d-250a-4709-9d55-cc3e54eb722d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 18:41:53 compute-0 nova_compute[183278]: 2026-01-21 18:41:53.914 183284 DEBUG nova.virt.libvirt.migration [None req-9f23b458-558d-42f2-87f6-59ac7b33b781 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 21 18:41:53 compute-0 nova_compute[183278]: 2026-01-21 18:41:53.915 183284 DEBUG nova.virt.libvirt.migration [None req-9f23b458-558d-42f2-87f6-59ac7b33b781 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Jan 21 18:41:53 compute-0 nova_compute[183278]: 2026-01-21 18:41:53.968 183284 DEBUG nova.virt.driver [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Emitting event <LifecycleEvent: 1769020913.968156, b53935fe-61d0-4662-9242-b4afae882b6e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:41:53 compute-0 nova_compute[183278]: 2026-01-21 18:41:53.969 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] VM Paused (Lifecycle Event)
Jan 21 18:41:53 compute-0 nova_compute[183278]: 2026-01-21 18:41:53.993 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:41:54 compute-0 nova_compute[183278]: 2026-01-21 18:41:54.052 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 18:41:54 compute-0 nova_compute[183278]: 2026-01-21 18:41:54.073 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] During sync_power_state the instance has a pending task (migrating). Skip.
Jan 21 18:41:54 compute-0 kernel: tap31eaba9d-25 (unregistering): left promiscuous mode
Jan 21 18:41:54 compute-0 NetworkManager[55506]: <info>  [1769020914.1951] device (tap31eaba9d-25): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 18:41:54 compute-0 nova_compute[183278]: 2026-01-21 18:41:54.203 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:41:54 compute-0 ovn_controller[95419]: 2026-01-21T18:41:54Z|00202|binding|INFO|Releasing lport 31eaba9d-250a-4709-9d55-cc3e54eb722d from this chassis (sb_readonly=0)
Jan 21 18:41:54 compute-0 ovn_controller[95419]: 2026-01-21T18:41:54Z|00203|binding|INFO|Setting lport 31eaba9d-250a-4709-9d55-cc3e54eb722d down in Southbound
Jan 21 18:41:54 compute-0 ovn_controller[95419]: 2026-01-21T18:41:54Z|00204|binding|INFO|Removing iface tap31eaba9d-25 ovn-installed in OVS
Jan 21 18:41:54 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:41:54.209 104698 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f4:5e:98 10.100.0.7'], port_security=['fa:16:3e:f4:5e:98 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '88a62794-b4a4-47e3-9cce-91e574e684c1'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'b53935fe-61d0-4662-9242-b4afae882b6e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f06157f1-7b10-4c16-804b-abfbafeaf616', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e24c6330d8214c038817e159aa32ee75', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'a8e8330a-c44f-4a58-8f42-8ca6339ea3b4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8c0eeedc-f49a-4818-9629-e33dc15bbbda, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>], logical_port=31eaba9d-250a-4709-9d55-cc3e54eb722d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 18:41:54 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:41:54.210 104698 INFO neutron.agent.ovn.metadata.agent [-] Port 31eaba9d-250a-4709-9d55-cc3e54eb722d in datapath f06157f1-7b10-4c16-804b-abfbafeaf616 unbound from our chassis
Jan 21 18:41:54 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:41:54.211 104698 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f06157f1-7b10-4c16-804b-abfbafeaf616, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 21 18:41:54 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:41:54.212 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[411d558c-be2f-47e0-b822-e8ce00a906a8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:41:54 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:41:54.212 104698 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f06157f1-7b10-4c16-804b-abfbafeaf616 namespace which is not needed anymore
Jan 21 18:41:54 compute-0 nova_compute[183278]: 2026-01-21 18:41:54.219 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:41:54 compute-0 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d0000001c.scope: Deactivated successfully.
Jan 21 18:41:54 compute-0 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d0000001c.scope: Consumed 13.810s CPU time.
Jan 21 18:41:54 compute-0 systemd-machined[154592]: Machine qemu-19-instance-0000001c terminated.
Jan 21 18:41:54 compute-0 podman[213256]: 2026-01-21 18:41:54.305567033 +0000 UTC m=+0.069389749 container health_status 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, release=1755695350, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, container_name=openstack_network_exporter, managed_by=edpm_ansible, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.expose-services=, vendor=Red Hat, Inc., distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container)
Jan 21 18:41:54 compute-0 neutron-haproxy-ovnmeta-f06157f1-7b10-4c16-804b-abfbafeaf616[213181]: [NOTICE]   (213185) : haproxy version is 2.8.14-c23fe91
Jan 21 18:41:54 compute-0 neutron-haproxy-ovnmeta-f06157f1-7b10-4c16-804b-abfbafeaf616[213181]: [NOTICE]   (213185) : path to executable is /usr/sbin/haproxy
Jan 21 18:41:54 compute-0 neutron-haproxy-ovnmeta-f06157f1-7b10-4c16-804b-abfbafeaf616[213181]: [WARNING]  (213185) : Exiting Master process...
Jan 21 18:41:54 compute-0 neutron-haproxy-ovnmeta-f06157f1-7b10-4c16-804b-abfbafeaf616[213181]: [ALERT]    (213185) : Current worker (213187) exited with code 143 (Terminated)
Jan 21 18:41:54 compute-0 neutron-haproxy-ovnmeta-f06157f1-7b10-4c16-804b-abfbafeaf616[213181]: [WARNING]  (213185) : All workers exited. Exiting... (0)
Jan 21 18:41:54 compute-0 systemd[1]: libpod-4337408152a2dfd0cb773578c5e4ebec44e809f3c3e302cb8cc0301269ff45a9.scope: Deactivated successfully.
Jan 21 18:41:54 compute-0 podman[213297]: 2026-01-21 18:41:54.364243371 +0000 UTC m=+0.050616595 container died 4337408152a2dfd0cb773578c5e4ebec44e809f3c3e302cb8cc0301269ff45a9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f06157f1-7b10-4c16-804b-abfbafeaf616, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 18:41:54 compute-0 nova_compute[183278]: 2026-01-21 18:41:54.437 183284 DEBUG nova.virt.libvirt.guest [None req-9f23b458-558d-42f2-87f6-59ac7b33b781 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Jan 21 18:41:54 compute-0 nova_compute[183278]: 2026-01-21 18:41:54.438 183284 INFO nova.virt.libvirt.driver [None req-9f23b458-558d-42f2-87f6-59ac7b33b781 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Migration operation has completed
Jan 21 18:41:54 compute-0 nova_compute[183278]: 2026-01-21 18:41:54.438 183284 INFO nova.compute.manager [None req-9f23b458-558d-42f2-87f6-59ac7b33b781 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] _post_live_migration() is started..
Jan 21 18:41:54 compute-0 nova_compute[183278]: 2026-01-21 18:41:54.445 183284 DEBUG nova.virt.libvirt.driver [None req-9f23b458-558d-42f2-87f6-59ac7b33b781 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Jan 21 18:41:54 compute-0 nova_compute[183278]: 2026-01-21 18:41:54.445 183284 DEBUG nova.virt.libvirt.driver [None req-9f23b458-558d-42f2-87f6-59ac7b33b781 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Jan 21 18:41:54 compute-0 nova_compute[183278]: 2026-01-21 18:41:54.445 183284 DEBUG nova.virt.libvirt.driver [None req-9f23b458-558d-42f2-87f6-59ac7b33b781 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Jan 21 18:41:54 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4337408152a2dfd0cb773578c5e4ebec44e809f3c3e302cb8cc0301269ff45a9-userdata-shm.mount: Deactivated successfully.
Jan 21 18:41:54 compute-0 systemd[1]: var-lib-containers-storage-overlay-8b67ca047e9e75ed69a0653b909cc623732bf5df21b033fd2cdb6dac1d9d9029-merged.mount: Deactivated successfully.
Jan 21 18:41:54 compute-0 podman[213297]: 2026-01-21 18:41:54.466261146 +0000 UTC m=+0.152634370 container cleanup 4337408152a2dfd0cb773578c5e4ebec44e809f3c3e302cb8cc0301269ff45a9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f06157f1-7b10-4c16-804b-abfbafeaf616, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 21 18:41:54 compute-0 systemd[1]: libpod-conmon-4337408152a2dfd0cb773578c5e4ebec44e809f3c3e302cb8cc0301269ff45a9.scope: Deactivated successfully.
Jan 21 18:41:54 compute-0 nova_compute[183278]: 2026-01-21 18:41:54.492 183284 DEBUG nova.compute.manager [req-4b8b1782-766a-4bf1-a996-c7ce9f696bf9 req-621f1d3d-cb4f-4c15-a543-7b6312a6267c 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Received event network-vif-unplugged-31eaba9d-250a-4709-9d55-cc3e54eb722d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:41:54 compute-0 nova_compute[183278]: 2026-01-21 18:41:54.493 183284 DEBUG oslo_concurrency.lockutils [req-4b8b1782-766a-4bf1-a996-c7ce9f696bf9 req-621f1d3d-cb4f-4c15-a543-7b6312a6267c 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "b53935fe-61d0-4662-9242-b4afae882b6e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:41:54 compute-0 nova_compute[183278]: 2026-01-21 18:41:54.493 183284 DEBUG oslo_concurrency.lockutils [req-4b8b1782-766a-4bf1-a996-c7ce9f696bf9 req-621f1d3d-cb4f-4c15-a543-7b6312a6267c 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "b53935fe-61d0-4662-9242-b4afae882b6e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:41:54 compute-0 nova_compute[183278]: 2026-01-21 18:41:54.493 183284 DEBUG oslo_concurrency.lockutils [req-4b8b1782-766a-4bf1-a996-c7ce9f696bf9 req-621f1d3d-cb4f-4c15-a543-7b6312a6267c 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "b53935fe-61d0-4662-9242-b4afae882b6e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:41:54 compute-0 nova_compute[183278]: 2026-01-21 18:41:54.493 183284 DEBUG nova.compute.manager [req-4b8b1782-766a-4bf1-a996-c7ce9f696bf9 req-621f1d3d-cb4f-4c15-a543-7b6312a6267c 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] No waiting events found dispatching network-vif-unplugged-31eaba9d-250a-4709-9d55-cc3e54eb722d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:41:54 compute-0 nova_compute[183278]: 2026-01-21 18:41:54.494 183284 DEBUG nova.compute.manager [req-4b8b1782-766a-4bf1-a996-c7ce9f696bf9 req-621f1d3d-cb4f-4c15-a543-7b6312a6267c 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Received event network-vif-unplugged-31eaba9d-250a-4709-9d55-cc3e54eb722d for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 21 18:41:54 compute-0 podman[213349]: 2026-01-21 18:41:54.526738559 +0000 UTC m=+0.040257674 container remove 4337408152a2dfd0cb773578c5e4ebec44e809f3c3e302cb8cc0301269ff45a9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f06157f1-7b10-4c16-804b-abfbafeaf616, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 21 18:41:54 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:41:54.531 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[3b97b7fc-3671-43b3-a590-0baa34cbf5da]: (4, ('Wed Jan 21 06:41:54 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-f06157f1-7b10-4c16-804b-abfbafeaf616 (4337408152a2dfd0cb773578c5e4ebec44e809f3c3e302cb8cc0301269ff45a9)\n4337408152a2dfd0cb773578c5e4ebec44e809f3c3e302cb8cc0301269ff45a9\nWed Jan 21 06:41:54 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-f06157f1-7b10-4c16-804b-abfbafeaf616 (4337408152a2dfd0cb773578c5e4ebec44e809f3c3e302cb8cc0301269ff45a9)\n4337408152a2dfd0cb773578c5e4ebec44e809f3c3e302cb8cc0301269ff45a9\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:41:54 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:41:54.533 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[9d1ef703-1ea8-4ee3-85a1-66f8ec8b3e06]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:41:54 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:41:54.533 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf06157f1-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:41:54 compute-0 nova_compute[183278]: 2026-01-21 18:41:54.535 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:41:54 compute-0 kernel: tapf06157f1-70: left promiscuous mode
Jan 21 18:41:54 compute-0 nova_compute[183278]: 2026-01-21 18:41:54.551 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:41:54 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:41:54.553 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[95684f7b-f31f-4b0d-8665-ce661d6d6122]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:41:54 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:41:54.573 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[2618b252-692a-4fd1-b9f7-946ee91550d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:41:54 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:41:54.574 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[9b20faea-4b23-4ef0-8eb3-7c6286b7d5da]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:41:54 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:41:54.590 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[f3575ffb-5638-4b7a-a90a-ddb4c2b6118f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 546106, 'reachable_time': 36858, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213368, 'error': None, 'target': 'ovnmeta-f06157f1-7b10-4c16-804b-abfbafeaf616', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:41:54 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:41:54.592 105036 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f06157f1-7b10-4c16-804b-abfbafeaf616 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 21 18:41:54 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:41:54.592 105036 DEBUG oslo.privsep.daemon [-] privsep: reply[094d49f1-de3d-4a15-8bf6-cb0ca2045184]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:41:54 compute-0 systemd[1]: run-netns-ovnmeta\x2df06157f1\x2d7b10\x2d4c16\x2d804b\x2dabfbafeaf616.mount: Deactivated successfully.
Jan 21 18:41:55 compute-0 nova_compute[183278]: 2026-01-21 18:41:55.027 183284 DEBUG nova.network.neutron [req-87af27bf-6283-4107-9c2a-e413bd6c053c req-6ab22fd1-bdda-4487-9505-537a60a15de4 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Updated VIF entry in instance network info cache for port 31eaba9d-250a-4709-9d55-cc3e54eb722d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 18:41:55 compute-0 nova_compute[183278]: 2026-01-21 18:41:55.027 183284 DEBUG nova.network.neutron [req-87af27bf-6283-4107-9c2a-e413bd6c053c req-6ab22fd1-bdda-4487-9505-537a60a15de4 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Updating instance_info_cache with network_info: [{"id": "31eaba9d-250a-4709-9d55-cc3e54eb722d", "address": "fa:16:3e:f4:5e:98", "network": {"id": "f06157f1-7b10-4c16-804b-abfbafeaf616", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-1767892451-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e24c6330d8214c038817e159aa32ee75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31eaba9d-25", "ovs_interfaceid": "31eaba9d-250a-4709-9d55-cc3e54eb722d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:41:55 compute-0 nova_compute[183278]: 2026-01-21 18:41:55.048 183284 DEBUG oslo_concurrency.lockutils [req-87af27bf-6283-4107-9c2a-e413bd6c053c req-6ab22fd1-bdda-4487-9505-537a60a15de4 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Releasing lock "refresh_cache-b53935fe-61d0-4662-9242-b4afae882b6e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 18:41:55 compute-0 nova_compute[183278]: 2026-01-21 18:41:55.285 183284 DEBUG nova.network.neutron [None req-9f23b458-558d-42f2-87f6-59ac7b33b781 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] Activated binding for port 31eaba9d-250a-4709-9d55-cc3e54eb722d and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Jan 21 18:41:55 compute-0 nova_compute[183278]: 2026-01-21 18:41:55.286 183284 DEBUG nova.compute.manager [None req-9f23b458-558d-42f2-87f6-59ac7b33b781 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "31eaba9d-250a-4709-9d55-cc3e54eb722d", "address": "fa:16:3e:f4:5e:98", "network": {"id": "f06157f1-7b10-4c16-804b-abfbafeaf616", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-1767892451-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e24c6330d8214c038817e159aa32ee75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31eaba9d-25", "ovs_interfaceid": "31eaba9d-250a-4709-9d55-cc3e54eb722d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Jan 21 18:41:55 compute-0 nova_compute[183278]: 2026-01-21 18:41:55.286 183284 DEBUG nova.virt.libvirt.vif [None req-9f23b458-558d-42f2-87f6-59ac7b33b781 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T18:41:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalancingStrategy-server-379452612',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancingstrategy-server-379452612',id=28,image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T18:41:38Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='e24c6330d8214c038817e159aa32ee75',ramdisk_id='',reservation_id='r-uwu7zcyj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadBalancingStrategy-1066838900',owner_user_name='tempest-TestExecuteWorkloadBalancingStrategy-1066838900-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T18:41:44Z,user_data=None,user_id='02dcb8b770104e6fbacfd9aced0763ce',uuid=b53935fe-61d0-4662-9242-b4afae882b6e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "31eaba9d-250a-4709-9d55-cc3e54eb722d", "address": "fa:16:3e:f4:5e:98", "network": {"id": "f06157f1-7b10-4c16-804b-abfbafeaf616", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-1767892451-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e24c6330d8214c038817e159aa32ee75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31eaba9d-25", "ovs_interfaceid": "31eaba9d-250a-4709-9d55-cc3e54eb722d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 18:41:55 compute-0 nova_compute[183278]: 2026-01-21 18:41:55.287 183284 DEBUG nova.network.os_vif_util [None req-9f23b458-558d-42f2-87f6-59ac7b33b781 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] Converting VIF {"id": "31eaba9d-250a-4709-9d55-cc3e54eb722d", "address": "fa:16:3e:f4:5e:98", "network": {"id": "f06157f1-7b10-4c16-804b-abfbafeaf616", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-1767892451-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e24c6330d8214c038817e159aa32ee75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31eaba9d-25", "ovs_interfaceid": "31eaba9d-250a-4709-9d55-cc3e54eb722d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 18:41:55 compute-0 nova_compute[183278]: 2026-01-21 18:41:55.288 183284 DEBUG nova.network.os_vif_util [None req-9f23b458-558d-42f2-87f6-59ac7b33b781 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f4:5e:98,bridge_name='br-int',has_traffic_filtering=True,id=31eaba9d-250a-4709-9d55-cc3e54eb722d,network=Network(f06157f1-7b10-4c16-804b-abfbafeaf616),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31eaba9d-25') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 18:41:55 compute-0 nova_compute[183278]: 2026-01-21 18:41:55.288 183284 DEBUG os_vif [None req-9f23b458-558d-42f2-87f6-59ac7b33b781 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f4:5e:98,bridge_name='br-int',has_traffic_filtering=True,id=31eaba9d-250a-4709-9d55-cc3e54eb722d,network=Network(f06157f1-7b10-4c16-804b-abfbafeaf616),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31eaba9d-25') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 21 18:41:55 compute-0 nova_compute[183278]: 2026-01-21 18:41:55.289 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:41:55 compute-0 nova_compute[183278]: 2026-01-21 18:41:55.290 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap31eaba9d-25, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:41:55 compute-0 nova_compute[183278]: 2026-01-21 18:41:55.291 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:41:55 compute-0 nova_compute[183278]: 2026-01-21 18:41:55.292 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:41:55 compute-0 nova_compute[183278]: 2026-01-21 18:41:55.294 183284 INFO os_vif [None req-9f23b458-558d-42f2-87f6-59ac7b33b781 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f4:5e:98,bridge_name='br-int',has_traffic_filtering=True,id=31eaba9d-250a-4709-9d55-cc3e54eb722d,network=Network(f06157f1-7b10-4c16-804b-abfbafeaf616),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31eaba9d-25')
Jan 21 18:41:55 compute-0 nova_compute[183278]: 2026-01-21 18:41:55.294 183284 DEBUG oslo_concurrency.lockutils [None req-9f23b458-558d-42f2-87f6-59ac7b33b781 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:41:55 compute-0 nova_compute[183278]: 2026-01-21 18:41:55.295 183284 DEBUG oslo_concurrency.lockutils [None req-9f23b458-558d-42f2-87f6-59ac7b33b781 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:41:55 compute-0 nova_compute[183278]: 2026-01-21 18:41:55.295 183284 DEBUG oslo_concurrency.lockutils [None req-9f23b458-558d-42f2-87f6-59ac7b33b781 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:41:55 compute-0 nova_compute[183278]: 2026-01-21 18:41:55.295 183284 DEBUG nova.compute.manager [None req-9f23b458-558d-42f2-87f6-59ac7b33b781 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Jan 21 18:41:55 compute-0 nova_compute[183278]: 2026-01-21 18:41:55.295 183284 INFO nova.virt.libvirt.driver [None req-9f23b458-558d-42f2-87f6-59ac7b33b781 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Deleting instance files /var/lib/nova/instances/b53935fe-61d0-4662-9242-b4afae882b6e_del
Jan 21 18:41:55 compute-0 nova_compute[183278]: 2026-01-21 18:41:55.296 183284 INFO nova.virt.libvirt.driver [None req-9f23b458-558d-42f2-87f6-59ac7b33b781 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Deletion of /var/lib/nova/instances/b53935fe-61d0-4662-9242-b4afae882b6e_del complete
Jan 21 18:41:55 compute-0 nova_compute[183278]: 2026-01-21 18:41:55.910 183284 DEBUG nova.compute.manager [req-bccab46a-64a2-4f47-88db-aa879c7f2d93 req-004627eb-59a6-4db7-adc7-b3dc837a8aeb 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Received event network-vif-unplugged-31eaba9d-250a-4709-9d55-cc3e54eb722d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:41:55 compute-0 nova_compute[183278]: 2026-01-21 18:41:55.910 183284 DEBUG oslo_concurrency.lockutils [req-bccab46a-64a2-4f47-88db-aa879c7f2d93 req-004627eb-59a6-4db7-adc7-b3dc837a8aeb 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "b53935fe-61d0-4662-9242-b4afae882b6e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:41:55 compute-0 nova_compute[183278]: 2026-01-21 18:41:55.911 183284 DEBUG oslo_concurrency.lockutils [req-bccab46a-64a2-4f47-88db-aa879c7f2d93 req-004627eb-59a6-4db7-adc7-b3dc837a8aeb 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "b53935fe-61d0-4662-9242-b4afae882b6e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:41:55 compute-0 nova_compute[183278]: 2026-01-21 18:41:55.911 183284 DEBUG oslo_concurrency.lockutils [req-bccab46a-64a2-4f47-88db-aa879c7f2d93 req-004627eb-59a6-4db7-adc7-b3dc837a8aeb 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "b53935fe-61d0-4662-9242-b4afae882b6e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:41:55 compute-0 nova_compute[183278]: 2026-01-21 18:41:55.911 183284 DEBUG nova.compute.manager [req-bccab46a-64a2-4f47-88db-aa879c7f2d93 req-004627eb-59a6-4db7-adc7-b3dc837a8aeb 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] No waiting events found dispatching network-vif-unplugged-31eaba9d-250a-4709-9d55-cc3e54eb722d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:41:55 compute-0 nova_compute[183278]: 2026-01-21 18:41:55.911 183284 DEBUG nova.compute.manager [req-bccab46a-64a2-4f47-88db-aa879c7f2d93 req-004627eb-59a6-4db7-adc7-b3dc837a8aeb 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Received event network-vif-unplugged-31eaba9d-250a-4709-9d55-cc3e54eb722d for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 21 18:41:55 compute-0 nova_compute[183278]: 2026-01-21 18:41:55.911 183284 DEBUG nova.compute.manager [req-bccab46a-64a2-4f47-88db-aa879c7f2d93 req-004627eb-59a6-4db7-adc7-b3dc837a8aeb 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Received event network-vif-plugged-31eaba9d-250a-4709-9d55-cc3e54eb722d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:41:55 compute-0 nova_compute[183278]: 2026-01-21 18:41:55.912 183284 DEBUG oslo_concurrency.lockutils [req-bccab46a-64a2-4f47-88db-aa879c7f2d93 req-004627eb-59a6-4db7-adc7-b3dc837a8aeb 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "b53935fe-61d0-4662-9242-b4afae882b6e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:41:55 compute-0 nova_compute[183278]: 2026-01-21 18:41:55.912 183284 DEBUG oslo_concurrency.lockutils [req-bccab46a-64a2-4f47-88db-aa879c7f2d93 req-004627eb-59a6-4db7-adc7-b3dc837a8aeb 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "b53935fe-61d0-4662-9242-b4afae882b6e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:41:55 compute-0 nova_compute[183278]: 2026-01-21 18:41:55.912 183284 DEBUG oslo_concurrency.lockutils [req-bccab46a-64a2-4f47-88db-aa879c7f2d93 req-004627eb-59a6-4db7-adc7-b3dc837a8aeb 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "b53935fe-61d0-4662-9242-b4afae882b6e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:41:55 compute-0 nova_compute[183278]: 2026-01-21 18:41:55.912 183284 DEBUG nova.compute.manager [req-bccab46a-64a2-4f47-88db-aa879c7f2d93 req-004627eb-59a6-4db7-adc7-b3dc837a8aeb 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] No waiting events found dispatching network-vif-plugged-31eaba9d-250a-4709-9d55-cc3e54eb722d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:41:55 compute-0 nova_compute[183278]: 2026-01-21 18:41:55.912 183284 WARNING nova.compute.manager [req-bccab46a-64a2-4f47-88db-aa879c7f2d93 req-004627eb-59a6-4db7-adc7-b3dc837a8aeb 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Received unexpected event network-vif-plugged-31eaba9d-250a-4709-9d55-cc3e54eb722d for instance with vm_state active and task_state migrating.
Jan 21 18:41:56 compute-0 nova_compute[183278]: 2026-01-21 18:41:56.568 183284 DEBUG nova.compute.manager [req-fbce3482-f467-4d87-8b8a-5e97407b6852 req-7e30f1a3-ae2a-4f49-9f1c-2faeabb44ed0 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Received event network-vif-plugged-31eaba9d-250a-4709-9d55-cc3e54eb722d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:41:56 compute-0 nova_compute[183278]: 2026-01-21 18:41:56.569 183284 DEBUG oslo_concurrency.lockutils [req-fbce3482-f467-4d87-8b8a-5e97407b6852 req-7e30f1a3-ae2a-4f49-9f1c-2faeabb44ed0 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "b53935fe-61d0-4662-9242-b4afae882b6e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:41:56 compute-0 nova_compute[183278]: 2026-01-21 18:41:56.569 183284 DEBUG oslo_concurrency.lockutils [req-fbce3482-f467-4d87-8b8a-5e97407b6852 req-7e30f1a3-ae2a-4f49-9f1c-2faeabb44ed0 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "b53935fe-61d0-4662-9242-b4afae882b6e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:41:56 compute-0 nova_compute[183278]: 2026-01-21 18:41:56.570 183284 DEBUG oslo_concurrency.lockutils [req-fbce3482-f467-4d87-8b8a-5e97407b6852 req-7e30f1a3-ae2a-4f49-9f1c-2faeabb44ed0 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "b53935fe-61d0-4662-9242-b4afae882b6e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:41:56 compute-0 nova_compute[183278]: 2026-01-21 18:41:56.570 183284 DEBUG nova.compute.manager [req-fbce3482-f467-4d87-8b8a-5e97407b6852 req-7e30f1a3-ae2a-4f49-9f1c-2faeabb44ed0 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] No waiting events found dispatching network-vif-plugged-31eaba9d-250a-4709-9d55-cc3e54eb722d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:41:56 compute-0 nova_compute[183278]: 2026-01-21 18:41:56.570 183284 WARNING nova.compute.manager [req-fbce3482-f467-4d87-8b8a-5e97407b6852 req-7e30f1a3-ae2a-4f49-9f1c-2faeabb44ed0 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Received unexpected event network-vif-plugged-31eaba9d-250a-4709-9d55-cc3e54eb722d for instance with vm_state active and task_state migrating.
Jan 21 18:41:56 compute-0 nova_compute[183278]: 2026-01-21 18:41:56.570 183284 DEBUG nova.compute.manager [req-fbce3482-f467-4d87-8b8a-5e97407b6852 req-7e30f1a3-ae2a-4f49-9f1c-2faeabb44ed0 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Received event network-vif-plugged-31eaba9d-250a-4709-9d55-cc3e54eb722d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:41:56 compute-0 nova_compute[183278]: 2026-01-21 18:41:56.571 183284 DEBUG oslo_concurrency.lockutils [req-fbce3482-f467-4d87-8b8a-5e97407b6852 req-7e30f1a3-ae2a-4f49-9f1c-2faeabb44ed0 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "b53935fe-61d0-4662-9242-b4afae882b6e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:41:56 compute-0 nova_compute[183278]: 2026-01-21 18:41:56.571 183284 DEBUG oslo_concurrency.lockutils [req-fbce3482-f467-4d87-8b8a-5e97407b6852 req-7e30f1a3-ae2a-4f49-9f1c-2faeabb44ed0 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "b53935fe-61d0-4662-9242-b4afae882b6e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:41:56 compute-0 nova_compute[183278]: 2026-01-21 18:41:56.571 183284 DEBUG oslo_concurrency.lockutils [req-fbce3482-f467-4d87-8b8a-5e97407b6852 req-7e30f1a3-ae2a-4f49-9f1c-2faeabb44ed0 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "b53935fe-61d0-4662-9242-b4afae882b6e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:41:56 compute-0 nova_compute[183278]: 2026-01-21 18:41:56.571 183284 DEBUG nova.compute.manager [req-fbce3482-f467-4d87-8b8a-5e97407b6852 req-7e30f1a3-ae2a-4f49-9f1c-2faeabb44ed0 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] No waiting events found dispatching network-vif-plugged-31eaba9d-250a-4709-9d55-cc3e54eb722d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:41:56 compute-0 nova_compute[183278]: 2026-01-21 18:41:56.571 183284 WARNING nova.compute.manager [req-fbce3482-f467-4d87-8b8a-5e97407b6852 req-7e30f1a3-ae2a-4f49-9f1c-2faeabb44ed0 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Received unexpected event network-vif-plugged-31eaba9d-250a-4709-9d55-cc3e54eb722d for instance with vm_state active and task_state migrating.
Jan 21 18:41:56 compute-0 nova_compute[183278]: 2026-01-21 18:41:56.572 183284 DEBUG nova.compute.manager [req-fbce3482-f467-4d87-8b8a-5e97407b6852 req-7e30f1a3-ae2a-4f49-9f1c-2faeabb44ed0 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Received event network-vif-plugged-31eaba9d-250a-4709-9d55-cc3e54eb722d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:41:56 compute-0 nova_compute[183278]: 2026-01-21 18:41:56.572 183284 DEBUG oslo_concurrency.lockutils [req-fbce3482-f467-4d87-8b8a-5e97407b6852 req-7e30f1a3-ae2a-4f49-9f1c-2faeabb44ed0 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "b53935fe-61d0-4662-9242-b4afae882b6e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:41:56 compute-0 nova_compute[183278]: 2026-01-21 18:41:56.572 183284 DEBUG oslo_concurrency.lockutils [req-fbce3482-f467-4d87-8b8a-5e97407b6852 req-7e30f1a3-ae2a-4f49-9f1c-2faeabb44ed0 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "b53935fe-61d0-4662-9242-b4afae882b6e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:41:56 compute-0 nova_compute[183278]: 2026-01-21 18:41:56.572 183284 DEBUG oslo_concurrency.lockutils [req-fbce3482-f467-4d87-8b8a-5e97407b6852 req-7e30f1a3-ae2a-4f49-9f1c-2faeabb44ed0 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "b53935fe-61d0-4662-9242-b4afae882b6e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:41:56 compute-0 nova_compute[183278]: 2026-01-21 18:41:56.572 183284 DEBUG nova.compute.manager [req-fbce3482-f467-4d87-8b8a-5e97407b6852 req-7e30f1a3-ae2a-4f49-9f1c-2faeabb44ed0 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] No waiting events found dispatching network-vif-plugged-31eaba9d-250a-4709-9d55-cc3e54eb722d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:41:56 compute-0 nova_compute[183278]: 2026-01-21 18:41:56.573 183284 WARNING nova.compute.manager [req-fbce3482-f467-4d87-8b8a-5e97407b6852 req-7e30f1a3-ae2a-4f49-9f1c-2faeabb44ed0 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Received unexpected event network-vif-plugged-31eaba9d-250a-4709-9d55-cc3e54eb722d for instance with vm_state active and task_state migrating.
Jan 21 18:41:56 compute-0 nova_compute[183278]: 2026-01-21 18:41:56.667 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:41:57 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:41:57.776 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a4db021d-a451-4e5f-8011-49af760bda68, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '27'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:41:59 compute-0 podman[192560]: time="2026-01-21T18:41:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 18:41:59 compute-0 podman[192560]: @ - - [21/Jan/2026:18:41:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15354 "" "Go-http-client/1.1"
Jan 21 18:41:59 compute-0 podman[192560]: @ - - [21/Jan/2026:18:41:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2178 "" "Go-http-client/1.1"
Jan 21 18:42:00 compute-0 nova_compute[183278]: 2026-01-21 18:42:00.292 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:42:01 compute-0 nova_compute[183278]: 2026-01-21 18:42:01.149 183284 DEBUG oslo_concurrency.lockutils [None req-9f23b458-558d-42f2-87f6-59ac7b33b781 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] Acquiring lock "b53935fe-61d0-4662-9242-b4afae882b6e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:42:01 compute-0 nova_compute[183278]: 2026-01-21 18:42:01.150 183284 DEBUG oslo_concurrency.lockutils [None req-9f23b458-558d-42f2-87f6-59ac7b33b781 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] Lock "b53935fe-61d0-4662-9242-b4afae882b6e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:42:01 compute-0 nova_compute[183278]: 2026-01-21 18:42:01.150 183284 DEBUG oslo_concurrency.lockutils [None req-9f23b458-558d-42f2-87f6-59ac7b33b781 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] Lock "b53935fe-61d0-4662-9242-b4afae882b6e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:42:01 compute-0 nova_compute[183278]: 2026-01-21 18:42:01.172 183284 DEBUG oslo_concurrency.lockutils [None req-9f23b458-558d-42f2-87f6-59ac7b33b781 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:42:01 compute-0 nova_compute[183278]: 2026-01-21 18:42:01.172 183284 DEBUG oslo_concurrency.lockutils [None req-9f23b458-558d-42f2-87f6-59ac7b33b781 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:42:01 compute-0 nova_compute[183278]: 2026-01-21 18:42:01.172 183284 DEBUG oslo_concurrency.lockutils [None req-9f23b458-558d-42f2-87f6-59ac7b33b781 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:42:01 compute-0 nova_compute[183278]: 2026-01-21 18:42:01.172 183284 DEBUG nova.compute.resource_tracker [None req-9f23b458-558d-42f2-87f6-59ac7b33b781 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 18:42:01 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Jan 21 18:42:01 compute-0 systemd[213233]: Activating special unit Exit the Session...
Jan 21 18:42:01 compute-0 systemd[213233]: Stopped target Main User Target.
Jan 21 18:42:01 compute-0 systemd[213233]: Stopped target Basic System.
Jan 21 18:42:01 compute-0 systemd[213233]: Stopped target Paths.
Jan 21 18:42:01 compute-0 systemd[213233]: Stopped target Sockets.
Jan 21 18:42:01 compute-0 systemd[213233]: Stopped target Timers.
Jan 21 18:42:01 compute-0 systemd[213233]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 21 18:42:01 compute-0 systemd[213233]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 21 18:42:01 compute-0 systemd[213233]: Closed D-Bus User Message Bus Socket.
Jan 21 18:42:01 compute-0 systemd[213233]: Stopped Create User's Volatile Files and Directories.
Jan 21 18:42:01 compute-0 systemd[213233]: Removed slice User Application Slice.
Jan 21 18:42:01 compute-0 systemd[213233]: Reached target Shutdown.
Jan 21 18:42:01 compute-0 systemd[213233]: Finished Exit the Session.
Jan 21 18:42:01 compute-0 systemd[213233]: Reached target Exit the Session.
Jan 21 18:42:01 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Jan 21 18:42:01 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Jan 21 18:42:01 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 21 18:42:01 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 21 18:42:01 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 21 18:42:01 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 21 18:42:01 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Jan 21 18:42:01 compute-0 podman[213370]: 2026-01-21 18:42:01.279464521 +0000 UTC m=+0.081153423 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true)
Jan 21 18:42:01 compute-0 podman[213369]: 2026-01-21 18:42:01.327609195 +0000 UTC m=+0.132624128 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 21 18:42:01 compute-0 nova_compute[183278]: 2026-01-21 18:42:01.373 183284 WARNING nova.virt.libvirt.driver [None req-9f23b458-558d-42f2-87f6-59ac7b33b781 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 18:42:01 compute-0 nova_compute[183278]: 2026-01-21 18:42:01.374 183284 DEBUG nova.compute.resource_tracker [None req-9f23b458-558d-42f2-87f6-59ac7b33b781 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5791MB free_disk=73.37881851196289GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 18:42:01 compute-0 nova_compute[183278]: 2026-01-21 18:42:01.374 183284 DEBUG oslo_concurrency.lockutils [None req-9f23b458-558d-42f2-87f6-59ac7b33b781 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:42:01 compute-0 nova_compute[183278]: 2026-01-21 18:42:01.375 183284 DEBUG oslo_concurrency.lockutils [None req-9f23b458-558d-42f2-87f6-59ac7b33b781 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:42:01 compute-0 openstack_network_exporter[195402]: ERROR   18:42:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 21 18:42:01 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:42:01 compute-0 openstack_network_exporter[195402]: ERROR   18:42:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 21 18:42:01 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:42:01 compute-0 nova_compute[183278]: 2026-01-21 18:42:01.435 183284 DEBUG nova.compute.resource_tracker [None req-9f23b458-558d-42f2-87f6-59ac7b33b781 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] Migration for instance b53935fe-61d0-4662-9242-b4afae882b6e refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Jan 21 18:42:01 compute-0 nova_compute[183278]: 2026-01-21 18:42:01.454 183284 DEBUG nova.compute.resource_tracker [None req-9f23b458-558d-42f2-87f6-59ac7b33b781 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Jan 21 18:42:01 compute-0 nova_compute[183278]: 2026-01-21 18:42:01.485 183284 DEBUG nova.compute.resource_tracker [None req-9f23b458-558d-42f2-87f6-59ac7b33b781 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] Migration da0f7378-4758-4827-b3bf-a67f6a912f5c is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Jan 21 18:42:01 compute-0 nova_compute[183278]: 2026-01-21 18:42:01.485 183284 DEBUG nova.compute.resource_tracker [None req-9f23b458-558d-42f2-87f6-59ac7b33b781 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 18:42:01 compute-0 nova_compute[183278]: 2026-01-21 18:42:01.486 183284 DEBUG nova.compute.resource_tracker [None req-9f23b458-558d-42f2-87f6-59ac7b33b781 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 18:42:01 compute-0 nova_compute[183278]: 2026-01-21 18:42:01.536 183284 DEBUG nova.compute.provider_tree [None req-9f23b458-558d-42f2-87f6-59ac7b33b781 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] Inventory has not changed in ProviderTree for provider: 502e4243-611b-433d-a766-9b485d51652d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 18:42:01 compute-0 nova_compute[183278]: 2026-01-21 18:42:01.549 183284 DEBUG nova.scheduler.client.report [None req-9f23b458-558d-42f2-87f6-59ac7b33b781 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] Inventory has not changed for provider 502e4243-611b-433d-a766-9b485d51652d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 18:42:01 compute-0 nova_compute[183278]: 2026-01-21 18:42:01.567 183284 DEBUG nova.compute.resource_tracker [None req-9f23b458-558d-42f2-87f6-59ac7b33b781 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 18:42:01 compute-0 nova_compute[183278]: 2026-01-21 18:42:01.567 183284 DEBUG oslo_concurrency.lockutils [None req-9f23b458-558d-42f2-87f6-59ac7b33b781 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.193s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:42:01 compute-0 nova_compute[183278]: 2026-01-21 18:42:01.572 183284 INFO nova.compute.manager [None req-9f23b458-558d-42f2-87f6-59ac7b33b781 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Jan 21 18:42:01 compute-0 nova_compute[183278]: 2026-01-21 18:42:01.669 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:42:01 compute-0 nova_compute[183278]: 2026-01-21 18:42:01.688 183284 INFO nova.scheduler.client.report [None req-9f23b458-558d-42f2-87f6-59ac7b33b781 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] Deleted allocation for migration da0f7378-4758-4827-b3bf-a67f6a912f5c
Jan 21 18:42:01 compute-0 nova_compute[183278]: 2026-01-21 18:42:01.688 183284 DEBUG nova.virt.libvirt.driver [None req-9f23b458-558d-42f2-87f6-59ac7b33b781 d6633a6e7c4844c9bdd54f29575422cf 05bd8d3790e24414a90ecf55ee1939e1 - - default default] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Jan 21 18:42:05 compute-0 nova_compute[183278]: 2026-01-21 18:42:05.294 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:42:06 compute-0 nova_compute[183278]: 2026-01-21 18:42:06.687 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:42:07 compute-0 podman[213416]: 2026-01-21 18:42:07.05132991 +0000 UTC m=+0.092063476 container health_status 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 21 18:42:09 compute-0 nova_compute[183278]: 2026-01-21 18:42:09.438 183284 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769020914.4368794, b53935fe-61d0-4662-9242-b4afae882b6e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:42:09 compute-0 nova_compute[183278]: 2026-01-21 18:42:09.439 183284 INFO nova.compute.manager [-] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] VM Stopped (Lifecycle Event)
Jan 21 18:42:09 compute-0 nova_compute[183278]: 2026-01-21 18:42:09.480 183284 DEBUG nova.compute.manager [None req-41011390-5d51-413c-b549-807befec468f - - - - - -] [instance: b53935fe-61d0-4662-9242-b4afae882b6e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:42:10 compute-0 nova_compute[183278]: 2026-01-21 18:42:10.344 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:42:11 compute-0 nova_compute[183278]: 2026-01-21 18:42:11.688 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:42:15 compute-0 nova_compute[183278]: 2026-01-21 18:42:15.348 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:42:16 compute-0 nova_compute[183278]: 2026-01-21 18:42:16.690 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:42:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:42:20.114 104698 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:42:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:42:20.115 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:42:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:42:20.115 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:42:20 compute-0 nova_compute[183278]: 2026-01-21 18:42:20.399 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:42:21 compute-0 nova_compute[183278]: 2026-01-21 18:42:21.692 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:42:21 compute-0 sshd-session[213414]: Connection closed by 162.142.125.42 port 54536 [preauth]
Jan 21 18:42:25 compute-0 podman[213440]: 2026-01-21 18:42:25.00728267 +0000 UTC m=+0.055261676 container health_status 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, container_name=openstack_network_exporter, name=ubi9-minimal, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.buildah.version=1.33.7, architecture=x86_64, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Jan 21 18:42:25 compute-0 nova_compute[183278]: 2026-01-21 18:42:25.404 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:42:26 compute-0 sshd-session[213463]: Invalid user ubuntu from 64.227.98.100 port 37896
Jan 21 18:42:26 compute-0 sshd-session[213463]: Connection closed by invalid user ubuntu 64.227.98.100 port 37896 [preauth]
Jan 21 18:42:26 compute-0 nova_compute[183278]: 2026-01-21 18:42:26.696 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:42:27 compute-0 nova_compute[183278]: 2026-01-21 18:42:27.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:42:27 compute-0 nova_compute[183278]: 2026-01-21 18:42:27.817 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 18:42:27 compute-0 nova_compute[183278]: 2026-01-21 18:42:27.818 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 18:42:27 compute-0 nova_compute[183278]: 2026-01-21 18:42:27.840 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 21 18:42:29 compute-0 podman[192560]: time="2026-01-21T18:42:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 18:42:29 compute-0 podman[192560]: @ - - [21/Jan/2026:18:42:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15354 "" "Go-http-client/1.1"
Jan 21 18:42:29 compute-0 podman[192560]: @ - - [21/Jan/2026:18:42:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2181 "" "Go-http-client/1.1"
Jan 21 18:42:30 compute-0 nova_compute[183278]: 2026-01-21 18:42:30.449 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:42:31 compute-0 openstack_network_exporter[195402]: ERROR   18:42:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 21 18:42:31 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:42:31 compute-0 openstack_network_exporter[195402]: ERROR   18:42:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 21 18:42:31 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:42:31 compute-0 nova_compute[183278]: 2026-01-21 18:42:31.699 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:42:31 compute-0 podman[213466]: 2026-01-21 18:42:31.987642274 +0000 UTC m=+0.043563975 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 21 18:42:32 compute-0 podman[213465]: 2026-01-21 18:42:32.022468467 +0000 UTC m=+0.079265579 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Jan 21 18:42:33 compute-0 nova_compute[183278]: 2026-01-21 18:42:33.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:42:33 compute-0 nova_compute[183278]: 2026-01-21 18:42:33.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:42:33 compute-0 nova_compute[183278]: 2026-01-21 18:42:33.818 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:42:35 compute-0 nova_compute[183278]: 2026-01-21 18:42:35.452 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:42:35 compute-0 nova_compute[183278]: 2026-01-21 18:42:35.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:42:36 compute-0 nova_compute[183278]: 2026-01-21 18:42:36.700 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:42:37 compute-0 podman[213507]: 2026-01-21 18:42:37.984194626 +0000 UTC m=+0.044461966 container health_status 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 21 18:42:38 compute-0 nova_compute[183278]: 2026-01-21 18:42:38.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:42:39 compute-0 nova_compute[183278]: 2026-01-21 18:42:39.257 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:42:39 compute-0 nova_compute[183278]: 2026-01-21 18:42:39.257 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:42:39 compute-0 nova_compute[183278]: 2026-01-21 18:42:39.257 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:42:39 compute-0 nova_compute[183278]: 2026-01-21 18:42:39.257 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 18:42:39 compute-0 nova_compute[183278]: 2026-01-21 18:42:39.469 183284 WARNING nova.virt.libvirt.driver [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 18:42:39 compute-0 nova_compute[183278]: 2026-01-21 18:42:39.470 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5822MB free_disk=73.37879943847656GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 18:42:39 compute-0 nova_compute[183278]: 2026-01-21 18:42:39.470 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:42:39 compute-0 nova_compute[183278]: 2026-01-21 18:42:39.470 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:42:39 compute-0 nova_compute[183278]: 2026-01-21 18:42:39.566 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 18:42:39 compute-0 nova_compute[183278]: 2026-01-21 18:42:39.567 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 18:42:39 compute-0 nova_compute[183278]: 2026-01-21 18:42:39.585 183284 DEBUG nova.compute.provider_tree [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Inventory has not changed in ProviderTree for provider: 502e4243-611b-433d-a766-9b485d51652d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 18:42:39 compute-0 nova_compute[183278]: 2026-01-21 18:42:39.602 183284 DEBUG nova.scheduler.client.report [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Inventory has not changed for provider 502e4243-611b-433d-a766-9b485d51652d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 18:42:39 compute-0 nova_compute[183278]: 2026-01-21 18:42:39.603 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 18:42:39 compute-0 nova_compute[183278]: 2026-01-21 18:42:39.604 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.133s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:42:40 compute-0 nova_compute[183278]: 2026-01-21 18:42:40.496 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:42:40 compute-0 nova_compute[183278]: 2026-01-21 18:42:40.604 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:42:40 compute-0 nova_compute[183278]: 2026-01-21 18:42:40.604 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:42:41 compute-0 nova_compute[183278]: 2026-01-21 18:42:41.702 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:42:45 compute-0 nova_compute[183278]: 2026-01-21 18:42:45.545 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:42:45 compute-0 nova_compute[183278]: 2026-01-21 18:42:45.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:42:45 compute-0 nova_compute[183278]: 2026-01-21 18:42:45.817 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 18:42:46 compute-0 nova_compute[183278]: 2026-01-21 18:42:46.703 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:42:50 compute-0 nova_compute[183278]: 2026-01-21 18:42:50.547 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:42:51 compute-0 nova_compute[183278]: 2026-01-21 18:42:51.704 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:42:55 compute-0 ovn_controller[95419]: 2026-01-21T18:42:55Z|00205|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Jan 21 18:42:55 compute-0 nova_compute[183278]: 2026-01-21 18:42:55.549 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:42:56 compute-0 podman[213530]: 2026-01-21 18:42:56.007194757 +0000 UTC m=+0.063937267 container health_status 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, architecture=x86_64, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, distribution-scope=public, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, version=9.6, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Jan 21 18:42:56 compute-0 nova_compute[183278]: 2026-01-21 18:42:56.706 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:42:59 compute-0 podman[192560]: time="2026-01-21T18:42:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 18:42:59 compute-0 podman[192560]: @ - - [21/Jan/2026:18:42:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15354 "" "Go-http-client/1.1"
Jan 21 18:42:59 compute-0 podman[192560]: @ - - [21/Jan/2026:18:42:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2178 "" "Go-http-client/1.1"
Jan 21 18:43:00 compute-0 nova_compute[183278]: 2026-01-21 18:43:00.577 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:43:01 compute-0 openstack_network_exporter[195402]: ERROR   18:43:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 21 18:43:01 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:43:01 compute-0 openstack_network_exporter[195402]: ERROR   18:43:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 21 18:43:01 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:43:01 compute-0 nova_compute[183278]: 2026-01-21 18:43:01.707 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:43:03 compute-0 podman[213553]: 2026-01-21 18:43:03.003290512 +0000 UTC m=+0.052531201 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 21 18:43:03 compute-0 podman[213552]: 2026-01-21 18:43:03.023561861 +0000 UTC m=+0.077836842 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 21 18:43:05 compute-0 nova_compute[183278]: 2026-01-21 18:43:05.580 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:43:06 compute-0 nova_compute[183278]: 2026-01-21 18:43:06.709 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:43:08 compute-0 podman[213597]: 2026-01-21 18:43:08.988535069 +0000 UTC m=+0.045422590 container health_status 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 18:43:10 compute-0 nova_compute[183278]: 2026-01-21 18:43:10.582 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:43:11 compute-0 nova_compute[183278]: 2026-01-21 18:43:11.710 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:43:15 compute-0 nova_compute[183278]: 2026-01-21 18:43:15.625 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:43:16 compute-0 nova_compute[183278]: 2026-01-21 18:43:16.711 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:43:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:43:20.115 104698 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:43:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:43:20.116 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:43:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:43:20.116 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:43:20 compute-0 nova_compute[183278]: 2026-01-21 18:43:20.672 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:43:21 compute-0 nova_compute[183278]: 2026-01-21 18:43:21.777 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:43:25 compute-0 nova_compute[183278]: 2026-01-21 18:43:25.674 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:43:26 compute-0 nova_compute[183278]: 2026-01-21 18:43:26.778 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:43:27 compute-0 podman[213622]: 2026-01-21 18:43:27.007257739 +0000 UTC m=+0.059629423 container health_status 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, 
vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, version=9.6, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Jan 21 18:43:28 compute-0 nova_compute[183278]: 2026-01-21 18:43:28.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:43:28 compute-0 nova_compute[183278]: 2026-01-21 18:43:28.817 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 18:43:28 compute-0 nova_compute[183278]: 2026-01-21 18:43:28.817 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 18:43:28 compute-0 nova_compute[183278]: 2026-01-21 18:43:28.839 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 21 18:43:29 compute-0 podman[192560]: time="2026-01-21T18:43:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 18:43:29 compute-0 podman[192560]: @ - - [21/Jan/2026:18:43:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15354 "" "Go-http-client/1.1"
Jan 21 18:43:29 compute-0 podman[192560]: @ - - [21/Jan/2026:18:43:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2181 "" "Go-http-client/1.1"
Jan 21 18:43:30 compute-0 nova_compute[183278]: 2026-01-21 18:43:30.675 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:43:31 compute-0 openstack_network_exporter[195402]: ERROR   18:43:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 21 18:43:31 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:43:31 compute-0 openstack_network_exporter[195402]: ERROR   18:43:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 21 18:43:31 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:43:31 compute-0 nova_compute[183278]: 2026-01-21 18:43:31.780 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:43:32 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:43:32.848 104698 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=28, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e2:aa:66', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f6:39:87:73:c8'}, ipsec=False) old=SB_Global(nb_cfg=27) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 18:43:32 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:43:32.849 104698 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 21 18:43:32 compute-0 nova_compute[183278]: 2026-01-21 18:43:32.849 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:43:33 compute-0 nova_compute[183278]: 2026-01-21 18:43:33.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:43:33 compute-0 nova_compute[183278]: 2026-01-21 18:43:33.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:43:34 compute-0 podman[213645]: 2026-01-21 18:43:34.015463926 +0000 UTC m=+0.073636191 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 21 18:43:34 compute-0 podman[213644]: 2026-01-21 18:43:34.019261658 +0000 UTC m=+0.080670381 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, 
managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Jan 21 18:43:34 compute-0 nova_compute[183278]: 2026-01-21 18:43:34.392 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:43:34 compute-0 nova_compute[183278]: 2026-01-21 18:43:34.811 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:43:35 compute-0 nova_compute[183278]: 2026-01-21 18:43:35.678 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:43:36 compute-0 nova_compute[183278]: 2026-01-21 18:43:36.782 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:43:36 compute-0 nova_compute[183278]: 2026-01-21 18:43:36.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:43:39 compute-0 nova_compute[183278]: 2026-01-21 18:43:39.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:43:39 compute-0 nova_compute[183278]: 2026-01-21 18:43:39.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:43:39 compute-0 nova_compute[183278]: 2026-01-21 18:43:39.842 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:43:39 compute-0 nova_compute[183278]: 2026-01-21 18:43:39.842 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:43:39 compute-0 nova_compute[183278]: 2026-01-21 18:43:39.843 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:43:39 compute-0 nova_compute[183278]: 2026-01-21 18:43:39.843 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 18:43:39 compute-0 nova_compute[183278]: 2026-01-21 18:43:39.978 183284 WARNING nova.virt.libvirt.driver [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 18:43:39 compute-0 nova_compute[183278]: 2026-01-21 18:43:39.978 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5836MB free_disk=73.3789176940918GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 18:43:39 compute-0 nova_compute[183278]: 2026-01-21 18:43:39.979 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:43:39 compute-0 nova_compute[183278]: 2026-01-21 18:43:39.979 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:43:39 compute-0 podman[213688]: 2026-01-21 18:43:39.998647525 +0000 UTC m=+0.052084650 container health_status 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 21 18:43:40 compute-0 nova_compute[183278]: 2026-01-21 18:43:40.033 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 18:43:40 compute-0 nova_compute[183278]: 2026-01-21 18:43:40.033 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 18:43:40 compute-0 nova_compute[183278]: 2026-01-21 18:43:40.047 183284 DEBUG nova.scheduler.client.report [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Refreshing inventories for resource provider 502e4243-611b-433d-a766-9b485d51652d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 21 18:43:40 compute-0 nova_compute[183278]: 2026-01-21 18:43:40.064 183284 DEBUG nova.scheduler.client.report [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Updating ProviderTree inventory for provider 502e4243-611b-433d-a766-9b485d51652d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 21 18:43:40 compute-0 nova_compute[183278]: 2026-01-21 18:43:40.064 183284 DEBUG nova.compute.provider_tree [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Updating inventory in ProviderTree for provider 502e4243-611b-433d-a766-9b485d51652d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 21 18:43:40 compute-0 nova_compute[183278]: 2026-01-21 18:43:40.076 183284 DEBUG nova.scheduler.client.report [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Refreshing aggregate associations for resource provider 502e4243-611b-433d-a766-9b485d51652d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 21 18:43:40 compute-0 nova_compute[183278]: 2026-01-21 18:43:40.098 183284 DEBUG nova.scheduler.client.report [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Refreshing trait associations for resource provider 502e4243-611b-433d-a766-9b485d51652d, traits: COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_RESCUE_BFV,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_MMX,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NODE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_IDE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 21 18:43:40 compute-0 nova_compute[183278]: 2026-01-21 18:43:40.125 183284 DEBUG nova.compute.provider_tree [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Inventory has not changed in ProviderTree for provider: 502e4243-611b-433d-a766-9b485d51652d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 18:43:40 compute-0 nova_compute[183278]: 2026-01-21 18:43:40.165 183284 DEBUG nova.scheduler.client.report [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Inventory has not changed for provider 502e4243-611b-433d-a766-9b485d51652d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 18:43:40 compute-0 nova_compute[183278]: 2026-01-21 18:43:40.167 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 18:43:40 compute-0 nova_compute[183278]: 2026-01-21 18:43:40.167 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.188s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:43:40 compute-0 nova_compute[183278]: 2026-01-21 18:43:40.681 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:43:41 compute-0 nova_compute[183278]: 2026-01-21 18:43:41.783 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:43:41 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:43:41.850 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a4db021d-a451-4e5f-8011-49af760bda68, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '28'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:43:42 compute-0 nova_compute[183278]: 2026-01-21 18:43:42.168 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:43:44 compute-0 nova_compute[183278]: 2026-01-21 18:43:44.812 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:43:45 compute-0 nova_compute[183278]: 2026-01-21 18:43:45.685 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:43:45 compute-0 nova_compute[183278]: 2026-01-21 18:43:45.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:43:45 compute-0 nova_compute[183278]: 2026-01-21 18:43:45.816 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 18:43:46 compute-0 nova_compute[183278]: 2026-01-21 18:43:46.785 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:43:50 compute-0 nova_compute[183278]: 2026-01-21 18:43:50.687 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:43:51 compute-0 nova_compute[183278]: 2026-01-21 18:43:51.787 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:43:55 compute-0 nova_compute[183278]: 2026-01-21 18:43:55.690 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:43:56 compute-0 nova_compute[183278]: 2026-01-21 18:43:56.789 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:43:58 compute-0 podman[213713]: 2026-01-21 18:43:58.008289062 +0000 UTC m=+0.065466613 container health_status 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, architecture=x86_64, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., name=ubi9-minimal, vendor=Red Hat, Inc., managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter)
Jan 21 18:43:59 compute-0 podman[192560]: time="2026-01-21T18:43:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 18:43:59 compute-0 podman[192560]: @ - - [21/Jan/2026:18:43:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15354 "" "Go-http-client/1.1"
Jan 21 18:43:59 compute-0 podman[192560]: @ - - [21/Jan/2026:18:43:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2183 "" "Go-http-client/1.1"
Jan 21 18:44:00 compute-0 nova_compute[183278]: 2026-01-21 18:44:00.693 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:44:01 compute-0 openstack_network_exporter[195402]: ERROR   18:44:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 21 18:44:01 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:44:01 compute-0 openstack_network_exporter[195402]: ERROR   18:44:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 21 18:44:01 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:44:01 compute-0 nova_compute[183278]: 2026-01-21 18:44:01.822 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:44:04 compute-0 podman[213735]: 2026-01-21 18:44:04.985721076 +0000 UTC m=+0.042276643 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 21 18:44:05 compute-0 podman[213734]: 2026-01-21 18:44:05.030413397 +0000 UTC m=+0.081223485 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 21 18:44:05 compute-0 nova_compute[183278]: 2026-01-21 18:44:05.695 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:44:06 compute-0 nova_compute[183278]: 2026-01-21 18:44:06.823 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:44:10 compute-0 ovn_controller[95419]: 2026-01-21T18:44:10Z|00206|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Jan 21 18:44:10 compute-0 nova_compute[183278]: 2026-01-21 18:44:10.698 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:44:10 compute-0 podman[213780]: 2026-01-21 18:44:10.993484719 +0000 UTC m=+0.047660993 container health_status 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 18:44:11 compute-0 nova_compute[183278]: 2026-01-21 18:44:11.825 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:44:13 compute-0 nova_compute[183278]: 2026-01-21 18:44:13.939 183284 DEBUG oslo_concurrency.lockutils [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] Acquiring lock "c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:44:13 compute-0 nova_compute[183278]: 2026-01-21 18:44:13.939 183284 DEBUG oslo_concurrency.lockutils [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] Lock "c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:44:13 compute-0 nova_compute[183278]: 2026-01-21 18:44:13.957 183284 DEBUG nova.compute.manager [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 21 18:44:14 compute-0 nova_compute[183278]: 2026-01-21 18:44:14.038 183284 DEBUG oslo_concurrency.lockutils [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:44:14 compute-0 nova_compute[183278]: 2026-01-21 18:44:14.038 183284 DEBUG oslo_concurrency.lockutils [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:44:14 compute-0 nova_compute[183278]: 2026-01-21 18:44:14.047 183284 DEBUG nova.virt.hardware [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 18:44:14 compute-0 nova_compute[183278]: 2026-01-21 18:44:14.047 183284 INFO nova.compute.claims [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Claim successful on node compute-0.ctlplane.example.com
Jan 21 18:44:14 compute-0 nova_compute[183278]: 2026-01-21 18:44:14.186 183284 DEBUG nova.compute.provider_tree [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] Inventory has not changed in ProviderTree for provider: 502e4243-611b-433d-a766-9b485d51652d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 18:44:14 compute-0 nova_compute[183278]: 2026-01-21 18:44:14.201 183284 DEBUG nova.scheduler.client.report [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] Inventory has not changed for provider 502e4243-611b-433d-a766-9b485d51652d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 18:44:14 compute-0 nova_compute[183278]: 2026-01-21 18:44:14.223 183284 DEBUG oslo_concurrency.lockutils [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.185s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:44:14 compute-0 nova_compute[183278]: 2026-01-21 18:44:14.224 183284 DEBUG nova.compute.manager [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 21 18:44:14 compute-0 nova_compute[183278]: 2026-01-21 18:44:14.265 183284 DEBUG nova.compute.manager [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 21 18:44:14 compute-0 nova_compute[183278]: 2026-01-21 18:44:14.265 183284 DEBUG nova.network.neutron [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 21 18:44:14 compute-0 nova_compute[183278]: 2026-01-21 18:44:14.286 183284 INFO nova.virt.libvirt.driver [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 21 18:44:14 compute-0 nova_compute[183278]: 2026-01-21 18:44:14.304 183284 DEBUG nova.compute.manager [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 21 18:44:14 compute-0 nova_compute[183278]: 2026-01-21 18:44:14.403 183284 DEBUG nova.compute.manager [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 21 18:44:14 compute-0 nova_compute[183278]: 2026-01-21 18:44:14.404 183284 DEBUG nova.virt.libvirt.driver [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 18:44:14 compute-0 nova_compute[183278]: 2026-01-21 18:44:14.404 183284 INFO nova.virt.libvirt.driver [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Creating image(s)
Jan 21 18:44:14 compute-0 nova_compute[183278]: 2026-01-21 18:44:14.405 183284 DEBUG oslo_concurrency.lockutils [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] Acquiring lock "/var/lib/nova/instances/c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:44:14 compute-0 nova_compute[183278]: 2026-01-21 18:44:14.405 183284 DEBUG oslo_concurrency.lockutils [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] Lock "/var/lib/nova/instances/c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:44:14 compute-0 nova_compute[183278]: 2026-01-21 18:44:14.406 183284 DEBUG oslo_concurrency.lockutils [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] Lock "/var/lib/nova/instances/c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:44:14 compute-0 nova_compute[183278]: 2026-01-21 18:44:14.418 183284 DEBUG nova.policy [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '822dc3e9895a43a190ab7f3b466742b3', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6ede9c1725ac4e8ea38db9268265acb5', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 21 18:44:14 compute-0 nova_compute[183278]: 2026-01-21 18:44:14.421 183284 DEBUG oslo_concurrency.processutils [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:44:14 compute-0 nova_compute[183278]: 2026-01-21 18:44:14.505 183284 DEBUG oslo_concurrency.processutils [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:44:14 compute-0 nova_compute[183278]: 2026-01-21 18:44:14.506 183284 DEBUG oslo_concurrency.lockutils [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] Acquiring lock "8bf45782a806e8f2f684ae874be9ab99d891a685" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:44:14 compute-0 nova_compute[183278]: 2026-01-21 18:44:14.506 183284 DEBUG oslo_concurrency.lockutils [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] Lock "8bf45782a806e8f2f684ae874be9ab99d891a685" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:44:14 compute-0 nova_compute[183278]: 2026-01-21 18:44:14.517 183284 DEBUG oslo_concurrency.processutils [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:44:14 compute-0 nova_compute[183278]: 2026-01-21 18:44:14.576 183284 DEBUG oslo_concurrency.processutils [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:44:14 compute-0 nova_compute[183278]: 2026-01-21 18:44:14.577 183284 DEBUG oslo_concurrency.processutils [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685,backing_fmt=raw /var/lib/nova/instances/c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:44:14 compute-0 nova_compute[183278]: 2026-01-21 18:44:14.607 183284 DEBUG oslo_concurrency.processutils [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685,backing_fmt=raw /var/lib/nova/instances/c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2/disk 1073741824" returned: 0 in 0.030s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:44:14 compute-0 nova_compute[183278]: 2026-01-21 18:44:14.608 183284 DEBUG oslo_concurrency.lockutils [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] Lock "8bf45782a806e8f2f684ae874be9ab99d891a685" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.101s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:44:14 compute-0 nova_compute[183278]: 2026-01-21 18:44:14.608 183284 DEBUG oslo_concurrency.processutils [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:44:14 compute-0 nova_compute[183278]: 2026-01-21 18:44:14.664 183284 DEBUG oslo_concurrency.processutils [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:44:14 compute-0 nova_compute[183278]: 2026-01-21 18:44:14.665 183284 DEBUG nova.virt.disk.api [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] Checking if we can resize image /var/lib/nova/instances/c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 18:44:14 compute-0 nova_compute[183278]: 2026-01-21 18:44:14.665 183284 DEBUG oslo_concurrency.processutils [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:44:14 compute-0 nova_compute[183278]: 2026-01-21 18:44:14.736 183284 DEBUG oslo_concurrency.processutils [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:44:14 compute-0 nova_compute[183278]: 2026-01-21 18:44:14.738 183284 DEBUG nova.virt.disk.api [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] Cannot resize image /var/lib/nova/instances/c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 18:44:14 compute-0 nova_compute[183278]: 2026-01-21 18:44:14.738 183284 DEBUG nova.objects.instance [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] Lazy-loading 'migration_context' on Instance uuid c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:44:14 compute-0 nova_compute[183278]: 2026-01-21 18:44:14.761 183284 DEBUG nova.virt.libvirt.driver [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 18:44:14 compute-0 nova_compute[183278]: 2026-01-21 18:44:14.762 183284 DEBUG nova.virt.libvirt.driver [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Ensure instance console log exists: /var/lib/nova/instances/c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 18:44:14 compute-0 nova_compute[183278]: 2026-01-21 18:44:14.763 183284 DEBUG oslo_concurrency.lockutils [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:44:14 compute-0 nova_compute[183278]: 2026-01-21 18:44:14.763 183284 DEBUG oslo_concurrency.lockutils [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:44:14 compute-0 nova_compute[183278]: 2026-01-21 18:44:14.763 183284 DEBUG oslo_concurrency.lockutils [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:44:15 compute-0 nova_compute[183278]: 2026-01-21 18:44:15.472 183284 DEBUG nova.network.neutron [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Successfully created port: e14186c1-07e6-4057-97ee-4c005c167ad0 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 21 18:44:15 compute-0 nova_compute[183278]: 2026-01-21 18:44:15.700 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:44:16 compute-0 nova_compute[183278]: 2026-01-21 18:44:16.826 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:44:16 compute-0 nova_compute[183278]: 2026-01-21 18:44:16.988 183284 DEBUG nova.network.neutron [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Successfully updated port: e14186c1-07e6-4057-97ee-4c005c167ad0 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 21 18:44:17 compute-0 nova_compute[183278]: 2026-01-21 18:44:17.005 183284 DEBUG oslo_concurrency.lockutils [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] Acquiring lock "refresh_cache-c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:44:17 compute-0 nova_compute[183278]: 2026-01-21 18:44:17.006 183284 DEBUG oslo_concurrency.lockutils [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] Acquired lock "refresh_cache-c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 18:44:17 compute-0 nova_compute[183278]: 2026-01-21 18:44:17.006 183284 DEBUG nova.network.neutron [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 18:44:17 compute-0 nova_compute[183278]: 2026-01-21 18:44:17.079 183284 DEBUG nova.compute.manager [req-3da047c1-81a0-45d6-b3df-d3a92da09151 req-b3ea318b-a5f3-41f0-a06a-7c35e298caed 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Received event network-changed-e14186c1-07e6-4057-97ee-4c005c167ad0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:44:17 compute-0 nova_compute[183278]: 2026-01-21 18:44:17.079 183284 DEBUG nova.compute.manager [req-3da047c1-81a0-45d6-b3df-d3a92da09151 req-b3ea318b-a5f3-41f0-a06a-7c35e298caed 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Refreshing instance network info cache due to event network-changed-e14186c1-07e6-4057-97ee-4c005c167ad0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 18:44:17 compute-0 nova_compute[183278]: 2026-01-21 18:44:17.080 183284 DEBUG oslo_concurrency.lockutils [req-3da047c1-81a0-45d6-b3df-d3a92da09151 req-b3ea318b-a5f3-41f0-a06a-7c35e298caed 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "refresh_cache-c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:44:17 compute-0 nova_compute[183278]: 2026-01-21 18:44:17.130 183284 DEBUG nova.network.neutron [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 18:44:17 compute-0 nova_compute[183278]: 2026-01-21 18:44:17.913 183284 DEBUG nova.network.neutron [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Updating instance_info_cache with network_info: [{"id": "e14186c1-07e6-4057-97ee-4c005c167ad0", "address": "fa:16:3e:4a:52:84", "network": {"id": "9c57fa57-0050-4bf4-8378-976d31aaf23b", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-751634950-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ede9c1725ac4e8ea38db9268265acb5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape14186c1-07", "ovs_interfaceid": "e14186c1-07e6-4057-97ee-4c005c167ad0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:44:17 compute-0 nova_compute[183278]: 2026-01-21 18:44:17.940 183284 DEBUG oslo_concurrency.lockutils [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] Releasing lock "refresh_cache-c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 18:44:17 compute-0 nova_compute[183278]: 2026-01-21 18:44:17.941 183284 DEBUG nova.compute.manager [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Instance network_info: |[{"id": "e14186c1-07e6-4057-97ee-4c005c167ad0", "address": "fa:16:3e:4a:52:84", "network": {"id": "9c57fa57-0050-4bf4-8378-976d31aaf23b", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-751634950-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ede9c1725ac4e8ea38db9268265acb5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape14186c1-07", "ovs_interfaceid": "e14186c1-07e6-4057-97ee-4c005c167ad0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 21 18:44:17 compute-0 nova_compute[183278]: 2026-01-21 18:44:17.941 183284 DEBUG oslo_concurrency.lockutils [req-3da047c1-81a0-45d6-b3df-d3a92da09151 req-b3ea318b-a5f3-41f0-a06a-7c35e298caed 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquired lock "refresh_cache-c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 18:44:17 compute-0 nova_compute[183278]: 2026-01-21 18:44:17.942 183284 DEBUG nova.network.neutron [req-3da047c1-81a0-45d6-b3df-d3a92da09151 req-b3ea318b-a5f3-41f0-a06a-7c35e298caed 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Refreshing network info cache for port e14186c1-07e6-4057-97ee-4c005c167ad0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 18:44:17 compute-0 nova_compute[183278]: 2026-01-21 18:44:17.945 183284 DEBUG nova.virt.libvirt.driver [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Start _get_guest_xml network_info=[{"id": "e14186c1-07e6-4057-97ee-4c005c167ad0", "address": "fa:16:3e:4a:52:84", "network": {"id": "9c57fa57-0050-4bf4-8378-976d31aaf23b", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-751634950-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ede9c1725ac4e8ea38db9268265acb5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape14186c1-07", "ovs_interfaceid": "e14186c1-07e6-4057-97ee-4c005c167ad0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T18:09:31Z,direct_url=<?>,disk_format='qcow2',id=672306ae-5521-4fc1-a825-a16d6d125c61,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='05bd8d3790e24414a90ecf55ee1939e1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T18:09:33Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'guest_format': None, 'disk_bus': 'virtio', 'encrypted': False, 'size': 0, 'device_type': 'disk', 'encryption_options': None, 'boot_index': 0, 'encryption_secret_uuid': None, 'encryption_format': None, 'image_id': '672306ae-5521-4fc1-a825-a16d6d125c61'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 18:44:17 compute-0 nova_compute[183278]: 2026-01-21 18:44:17.949 183284 WARNING nova.virt.libvirt.driver [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 18:44:17 compute-0 nova_compute[183278]: 2026-01-21 18:44:17.954 183284 DEBUG nova.virt.libvirt.host [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 18:44:17 compute-0 nova_compute[183278]: 2026-01-21 18:44:17.955 183284 DEBUG nova.virt.libvirt.host [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 18:44:17 compute-0 nova_compute[183278]: 2026-01-21 18:44:17.960 183284 DEBUG nova.virt.libvirt.host [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 18:44:17 compute-0 nova_compute[183278]: 2026-01-21 18:44:17.961 183284 DEBUG nova.virt.libvirt.host [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 18:44:17 compute-0 nova_compute[183278]: 2026-01-21 18:44:17.962 183284 DEBUG nova.virt.libvirt.driver [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 18:44:17 compute-0 nova_compute[183278]: 2026-01-21 18:44:17.962 183284 DEBUG nova.virt.hardware [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T18:09:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='45095fe9-3fd5-4f1f-87b2-a2a8292135a2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T18:09:31Z,direct_url=<?>,disk_format='qcow2',id=672306ae-5521-4fc1-a825-a16d6d125c61,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='05bd8d3790e24414a90ecf55ee1939e1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T18:09:33Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 18:44:17 compute-0 nova_compute[183278]: 2026-01-21 18:44:17.962 183284 DEBUG nova.virt.hardware [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 18:44:17 compute-0 nova_compute[183278]: 2026-01-21 18:44:17.963 183284 DEBUG nova.virt.hardware [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 18:44:17 compute-0 nova_compute[183278]: 2026-01-21 18:44:17.963 183284 DEBUG nova.virt.hardware [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 18:44:17 compute-0 nova_compute[183278]: 2026-01-21 18:44:17.963 183284 DEBUG nova.virt.hardware [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 18:44:17 compute-0 nova_compute[183278]: 2026-01-21 18:44:17.963 183284 DEBUG nova.virt.hardware [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 18:44:17 compute-0 nova_compute[183278]: 2026-01-21 18:44:17.964 183284 DEBUG nova.virt.hardware [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 18:44:17 compute-0 nova_compute[183278]: 2026-01-21 18:44:17.964 183284 DEBUG nova.virt.hardware [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 18:44:17 compute-0 nova_compute[183278]: 2026-01-21 18:44:17.964 183284 DEBUG nova.virt.hardware [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 18:44:17 compute-0 nova_compute[183278]: 2026-01-21 18:44:17.964 183284 DEBUG nova.virt.hardware [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 18:44:17 compute-0 nova_compute[183278]: 2026-01-21 18:44:17.965 183284 DEBUG nova.virt.hardware [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 18:44:17 compute-0 nova_compute[183278]: 2026-01-21 18:44:17.968 183284 DEBUG nova.virt.libvirt.vif [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T18:44:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-417289613',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-417289613',id=29,image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6ede9c1725ac4e8ea38db9268265acb5',ramdisk_id='',reservation_id='r-72t15kiz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-1786177469',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-178
6177469-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T18:44:14Z,user_data=None,user_id='822dc3e9895a43a190ab7f3b466742b3',uuid=c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e14186c1-07e6-4057-97ee-4c005c167ad0", "address": "fa:16:3e:4a:52:84", "network": {"id": "9c57fa57-0050-4bf4-8378-976d31aaf23b", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-751634950-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ede9c1725ac4e8ea38db9268265acb5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape14186c1-07", "ovs_interfaceid": "e14186c1-07e6-4057-97ee-4c005c167ad0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 18:44:17 compute-0 nova_compute[183278]: 2026-01-21 18:44:17.968 183284 DEBUG nova.network.os_vif_util [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] Converting VIF {"id": "e14186c1-07e6-4057-97ee-4c005c167ad0", "address": "fa:16:3e:4a:52:84", "network": {"id": "9c57fa57-0050-4bf4-8378-976d31aaf23b", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-751634950-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ede9c1725ac4e8ea38db9268265acb5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape14186c1-07", "ovs_interfaceid": "e14186c1-07e6-4057-97ee-4c005c167ad0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 18:44:17 compute-0 nova_compute[183278]: 2026-01-21 18:44:17.969 183284 DEBUG nova.network.os_vif_util [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4a:52:84,bridge_name='br-int',has_traffic_filtering=True,id=e14186c1-07e6-4057-97ee-4c005c167ad0,network=Network(9c57fa57-0050-4bf4-8378-976d31aaf23b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape14186c1-07') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 18:44:17 compute-0 nova_compute[183278]: 2026-01-21 18:44:17.970 183284 DEBUG nova.objects.instance [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] Lazy-loading 'pci_devices' on Instance uuid c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:44:17 compute-0 nova_compute[183278]: 2026-01-21 18:44:17.984 183284 DEBUG nova.virt.libvirt.driver [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] End _get_guest_xml xml=<domain type="kvm">
Jan 21 18:44:17 compute-0 nova_compute[183278]:   <uuid>c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2</uuid>
Jan 21 18:44:17 compute-0 nova_compute[183278]:   <name>instance-0000001d</name>
Jan 21 18:44:17 compute-0 nova_compute[183278]:   <memory>131072</memory>
Jan 21 18:44:17 compute-0 nova_compute[183278]:   <vcpu>1</vcpu>
Jan 21 18:44:17 compute-0 nova_compute[183278]:   <metadata>
Jan 21 18:44:17 compute-0 nova_compute[183278]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 18:44:17 compute-0 nova_compute[183278]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 18:44:17 compute-0 nova_compute[183278]:       <nova:name>tempest-TestExecuteZoneMigrationStrategy-server-417289613</nova:name>
Jan 21 18:44:17 compute-0 nova_compute[183278]:       <nova:creationTime>2026-01-21 18:44:17</nova:creationTime>
Jan 21 18:44:17 compute-0 nova_compute[183278]:       <nova:flavor name="m1.nano">
Jan 21 18:44:17 compute-0 nova_compute[183278]:         <nova:memory>128</nova:memory>
Jan 21 18:44:17 compute-0 nova_compute[183278]:         <nova:disk>1</nova:disk>
Jan 21 18:44:17 compute-0 nova_compute[183278]:         <nova:swap>0</nova:swap>
Jan 21 18:44:17 compute-0 nova_compute[183278]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 18:44:17 compute-0 nova_compute[183278]:         <nova:vcpus>1</nova:vcpus>
Jan 21 18:44:17 compute-0 nova_compute[183278]:       </nova:flavor>
Jan 21 18:44:17 compute-0 nova_compute[183278]:       <nova:owner>
Jan 21 18:44:17 compute-0 nova_compute[183278]:         <nova:user uuid="822dc3e9895a43a190ab7f3b466742b3">tempest-TestExecuteZoneMigrationStrategy-1786177469-project-member</nova:user>
Jan 21 18:44:17 compute-0 nova_compute[183278]:         <nova:project uuid="6ede9c1725ac4e8ea38db9268265acb5">tempest-TestExecuteZoneMigrationStrategy-1786177469</nova:project>
Jan 21 18:44:17 compute-0 nova_compute[183278]:       </nova:owner>
Jan 21 18:44:17 compute-0 nova_compute[183278]:       <nova:root type="image" uuid="672306ae-5521-4fc1-a825-a16d6d125c61"/>
Jan 21 18:44:17 compute-0 nova_compute[183278]:       <nova:ports>
Jan 21 18:44:17 compute-0 nova_compute[183278]:         <nova:port uuid="e14186c1-07e6-4057-97ee-4c005c167ad0">
Jan 21 18:44:17 compute-0 nova_compute[183278]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 21 18:44:17 compute-0 nova_compute[183278]:         </nova:port>
Jan 21 18:44:17 compute-0 nova_compute[183278]:       </nova:ports>
Jan 21 18:44:17 compute-0 nova_compute[183278]:     </nova:instance>
Jan 21 18:44:17 compute-0 nova_compute[183278]:   </metadata>
Jan 21 18:44:17 compute-0 nova_compute[183278]:   <sysinfo type="smbios">
Jan 21 18:44:17 compute-0 nova_compute[183278]:     <system>
Jan 21 18:44:17 compute-0 nova_compute[183278]:       <entry name="manufacturer">RDO</entry>
Jan 21 18:44:17 compute-0 nova_compute[183278]:       <entry name="product">OpenStack Compute</entry>
Jan 21 18:44:17 compute-0 nova_compute[183278]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 18:44:17 compute-0 nova_compute[183278]:       <entry name="serial">c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2</entry>
Jan 21 18:44:17 compute-0 nova_compute[183278]:       <entry name="uuid">c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2</entry>
Jan 21 18:44:17 compute-0 nova_compute[183278]:       <entry name="family">Virtual Machine</entry>
Jan 21 18:44:17 compute-0 nova_compute[183278]:     </system>
Jan 21 18:44:17 compute-0 nova_compute[183278]:   </sysinfo>
Jan 21 18:44:17 compute-0 nova_compute[183278]:   <os>
Jan 21 18:44:17 compute-0 nova_compute[183278]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 18:44:17 compute-0 nova_compute[183278]:     <boot dev="hd"/>
Jan 21 18:44:17 compute-0 nova_compute[183278]:     <smbios mode="sysinfo"/>
Jan 21 18:44:17 compute-0 nova_compute[183278]:   </os>
Jan 21 18:44:17 compute-0 nova_compute[183278]:   <features>
Jan 21 18:44:17 compute-0 nova_compute[183278]:     <acpi/>
Jan 21 18:44:17 compute-0 nova_compute[183278]:     <apic/>
Jan 21 18:44:17 compute-0 nova_compute[183278]:     <vmcoreinfo/>
Jan 21 18:44:17 compute-0 nova_compute[183278]:   </features>
Jan 21 18:44:17 compute-0 nova_compute[183278]:   <clock offset="utc">
Jan 21 18:44:17 compute-0 nova_compute[183278]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 18:44:17 compute-0 nova_compute[183278]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 18:44:17 compute-0 nova_compute[183278]:     <timer name="hpet" present="no"/>
Jan 21 18:44:17 compute-0 nova_compute[183278]:   </clock>
Jan 21 18:44:17 compute-0 nova_compute[183278]:   <cpu mode="custom" match="exact">
Jan 21 18:44:17 compute-0 nova_compute[183278]:     <model>Nehalem</model>
Jan 21 18:44:17 compute-0 nova_compute[183278]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 18:44:17 compute-0 nova_compute[183278]:   </cpu>
Jan 21 18:44:17 compute-0 nova_compute[183278]:   <devices>
Jan 21 18:44:17 compute-0 nova_compute[183278]:     <disk type="file" device="disk">
Jan 21 18:44:17 compute-0 nova_compute[183278]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 18:44:17 compute-0 nova_compute[183278]:       <source file="/var/lib/nova/instances/c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2/disk"/>
Jan 21 18:44:17 compute-0 nova_compute[183278]:       <target dev="vda" bus="virtio"/>
Jan 21 18:44:17 compute-0 nova_compute[183278]:     </disk>
Jan 21 18:44:17 compute-0 nova_compute[183278]:     <disk type="file" device="cdrom">
Jan 21 18:44:17 compute-0 nova_compute[183278]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 18:44:17 compute-0 nova_compute[183278]:       <source file="/var/lib/nova/instances/c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2/disk.config"/>
Jan 21 18:44:17 compute-0 nova_compute[183278]:       <target dev="sda" bus="sata"/>
Jan 21 18:44:17 compute-0 nova_compute[183278]:     </disk>
Jan 21 18:44:17 compute-0 nova_compute[183278]:     <interface type="ethernet">
Jan 21 18:44:17 compute-0 nova_compute[183278]:       <mac address="fa:16:3e:4a:52:84"/>
Jan 21 18:44:17 compute-0 nova_compute[183278]:       <model type="virtio"/>
Jan 21 18:44:17 compute-0 nova_compute[183278]:       <driver name="vhost" rx_queue_size="512"/>
Jan 21 18:44:17 compute-0 nova_compute[183278]:       <mtu size="1442"/>
Jan 21 18:44:17 compute-0 nova_compute[183278]:       <target dev="tape14186c1-07"/>
Jan 21 18:44:17 compute-0 nova_compute[183278]:     </interface>
Jan 21 18:44:17 compute-0 nova_compute[183278]:     <serial type="pty">
Jan 21 18:44:17 compute-0 nova_compute[183278]:       <log file="/var/lib/nova/instances/c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2/console.log" append="off"/>
Jan 21 18:44:17 compute-0 nova_compute[183278]:     </serial>
Jan 21 18:44:17 compute-0 nova_compute[183278]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 18:44:17 compute-0 nova_compute[183278]:     <video>
Jan 21 18:44:17 compute-0 nova_compute[183278]:       <model type="virtio"/>
Jan 21 18:44:17 compute-0 nova_compute[183278]:     </video>
Jan 21 18:44:17 compute-0 nova_compute[183278]:     <input type="tablet" bus="usb"/>
Jan 21 18:44:17 compute-0 nova_compute[183278]:     <rng model="virtio">
Jan 21 18:44:17 compute-0 nova_compute[183278]:       <backend model="random">/dev/urandom</backend>
Jan 21 18:44:17 compute-0 nova_compute[183278]:     </rng>
Jan 21 18:44:17 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root"/>
Jan 21 18:44:17 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:44:17 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:44:17 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:44:17 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:44:17 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:44:17 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:44:17 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:44:17 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:44:17 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:44:17 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:44:17 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:44:17 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:44:17 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:44:17 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:44:17 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:44:17 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:44:17 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:44:17 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:44:17 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:44:17 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:44:17 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:44:17 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:44:17 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:44:17 compute-0 nova_compute[183278]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 18:44:17 compute-0 nova_compute[183278]:     <controller type="usb" index="0"/>
Jan 21 18:44:17 compute-0 nova_compute[183278]:     <memballoon model="virtio">
Jan 21 18:44:17 compute-0 nova_compute[183278]:       <stats period="10"/>
Jan 21 18:44:17 compute-0 nova_compute[183278]:     </memballoon>
Jan 21 18:44:17 compute-0 nova_compute[183278]:   </devices>
Jan 21 18:44:17 compute-0 nova_compute[183278]: </domain>
Jan 21 18:44:17 compute-0 nova_compute[183278]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 18:44:17 compute-0 nova_compute[183278]: 2026-01-21 18:44:17.985 183284 DEBUG nova.compute.manager [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Preparing to wait for external event network-vif-plugged-e14186c1-07e6-4057-97ee-4c005c167ad0 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 21 18:44:17 compute-0 nova_compute[183278]: 2026-01-21 18:44:17.985 183284 DEBUG oslo_concurrency.lockutils [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] Acquiring lock "c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:44:17 compute-0 nova_compute[183278]: 2026-01-21 18:44:17.985 183284 DEBUG oslo_concurrency.lockutils [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] Lock "c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:44:17 compute-0 nova_compute[183278]: 2026-01-21 18:44:17.986 183284 DEBUG oslo_concurrency.lockutils [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] Lock "c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:44:17 compute-0 nova_compute[183278]: 2026-01-21 18:44:17.986 183284 DEBUG nova.virt.libvirt.vif [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T18:44:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-417289613',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-417289613',id=29,image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6ede9c1725ac4e8ea38db9268265acb5',ramdisk_id='',reservation_id='r-72t15kiz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-1786177469',owner_user_name='tempest-TestExecuteZoneMigrationSt
rategy-1786177469-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T18:44:14Z,user_data=None,user_id='822dc3e9895a43a190ab7f3b466742b3',uuid=c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e14186c1-07e6-4057-97ee-4c005c167ad0", "address": "fa:16:3e:4a:52:84", "network": {"id": "9c57fa57-0050-4bf4-8378-976d31aaf23b", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-751634950-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ede9c1725ac4e8ea38db9268265acb5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape14186c1-07", "ovs_interfaceid": "e14186c1-07e6-4057-97ee-4c005c167ad0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 21 18:44:17 compute-0 nova_compute[183278]: 2026-01-21 18:44:17.987 183284 DEBUG nova.network.os_vif_util [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] Converting VIF {"id": "e14186c1-07e6-4057-97ee-4c005c167ad0", "address": "fa:16:3e:4a:52:84", "network": {"id": "9c57fa57-0050-4bf4-8378-976d31aaf23b", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-751634950-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ede9c1725ac4e8ea38db9268265acb5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape14186c1-07", "ovs_interfaceid": "e14186c1-07e6-4057-97ee-4c005c167ad0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 18:44:17 compute-0 nova_compute[183278]: 2026-01-21 18:44:17.987 183284 DEBUG nova.network.os_vif_util [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4a:52:84,bridge_name='br-int',has_traffic_filtering=True,id=e14186c1-07e6-4057-97ee-4c005c167ad0,network=Network(9c57fa57-0050-4bf4-8378-976d31aaf23b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape14186c1-07') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 18:44:17 compute-0 nova_compute[183278]: 2026-01-21 18:44:17.988 183284 DEBUG os_vif [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4a:52:84,bridge_name='br-int',has_traffic_filtering=True,id=e14186c1-07e6-4057-97ee-4c005c167ad0,network=Network(9c57fa57-0050-4bf4-8378-976d31aaf23b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape14186c1-07') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 21 18:44:17 compute-0 nova_compute[183278]: 2026-01-21 18:44:17.988 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:44:17 compute-0 nova_compute[183278]: 2026-01-21 18:44:17.988 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:44:17 compute-0 nova_compute[183278]: 2026-01-21 18:44:17.989 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 18:44:17 compute-0 nova_compute[183278]: 2026-01-21 18:44:17.991 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:44:17 compute-0 nova_compute[183278]: 2026-01-21 18:44:17.991 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape14186c1-07, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:44:17 compute-0 nova_compute[183278]: 2026-01-21 18:44:17.991 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape14186c1-07, col_values=(('external_ids', {'iface-id': 'e14186c1-07e6-4057-97ee-4c005c167ad0', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:4a:52:84', 'vm-uuid': 'c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:44:17 compute-0 nova_compute[183278]: 2026-01-21 18:44:17.993 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:44:17 compute-0 NetworkManager[55506]: <info>  [1769021057.9946] manager: (tape14186c1-07): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/88)
Jan 21 18:44:17 compute-0 nova_compute[183278]: 2026-01-21 18:44:17.995 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 18:44:17 compute-0 nova_compute[183278]: 2026-01-21 18:44:17.999 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:44:18 compute-0 nova_compute[183278]: 2026-01-21 18:44:17.999 183284 INFO os_vif [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4a:52:84,bridge_name='br-int',has_traffic_filtering=True,id=e14186c1-07e6-4057-97ee-4c005c167ad0,network=Network(9c57fa57-0050-4bf4-8378-976d31aaf23b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape14186c1-07')
Jan 21 18:44:18 compute-0 nova_compute[183278]: 2026-01-21 18:44:18.052 183284 DEBUG nova.virt.libvirt.driver [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 18:44:18 compute-0 nova_compute[183278]: 2026-01-21 18:44:18.053 183284 DEBUG nova.virt.libvirt.driver [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 18:44:18 compute-0 nova_compute[183278]: 2026-01-21 18:44:18.053 183284 DEBUG nova.virt.libvirt.driver [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] No VIF found with MAC fa:16:3e:4a:52:84, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 21 18:44:18 compute-0 nova_compute[183278]: 2026-01-21 18:44:18.053 183284 INFO nova.virt.libvirt.driver [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Using config drive
Jan 21 18:44:18 compute-0 nova_compute[183278]: 2026-01-21 18:44:18.396 183284 INFO nova.virt.libvirt.driver [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Creating config drive at /var/lib/nova/instances/c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2/disk.config
Jan 21 18:44:18 compute-0 nova_compute[183278]: 2026-01-21 18:44:18.404 183284 DEBUG oslo_concurrency.processutils [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp22ukz7vh execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:44:18 compute-0 nova_compute[183278]: 2026-01-21 18:44:18.527 183284 DEBUG oslo_concurrency.processutils [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp22ukz7vh" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:44:18 compute-0 kernel: tape14186c1-07: entered promiscuous mode
Jan 21 18:44:18 compute-0 NetworkManager[55506]: <info>  [1769021058.6062] manager: (tape14186c1-07): new Tun device (/org/freedesktop/NetworkManager/Devices/89)
Jan 21 18:44:18 compute-0 ovn_controller[95419]: 2026-01-21T18:44:18Z|00207|binding|INFO|Claiming lport e14186c1-07e6-4057-97ee-4c005c167ad0 for this chassis.
Jan 21 18:44:18 compute-0 ovn_controller[95419]: 2026-01-21T18:44:18Z|00208|binding|INFO|e14186c1-07e6-4057-97ee-4c005c167ad0: Claiming fa:16:3e:4a:52:84 10.100.0.8
Jan 21 18:44:18 compute-0 nova_compute[183278]: 2026-01-21 18:44:18.606 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:44:18 compute-0 nova_compute[183278]: 2026-01-21 18:44:18.610 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:44:18 compute-0 systemd-udevd[213838]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 18:44:18 compute-0 systemd-machined[154592]: New machine qemu-20-instance-0000001d.
Jan 21 18:44:18 compute-0 NetworkManager[55506]: <info>  [1769021058.6595] device (tape14186c1-07): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 18:44:18 compute-0 NetworkManager[55506]: <info>  [1769021058.6607] device (tape14186c1-07): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 18:44:18 compute-0 nova_compute[183278]: 2026-01-21 18:44:18.663 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:44:18 compute-0 ovn_controller[95419]: 2026-01-21T18:44:18Z|00209|binding|INFO|Setting lport e14186c1-07e6-4057-97ee-4c005c167ad0 ovn-installed in OVS
Jan 21 18:44:18 compute-0 nova_compute[183278]: 2026-01-21 18:44:18.669 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:44:18 compute-0 systemd[1]: Started Virtual Machine qemu-20-instance-0000001d.
Jan 21 18:44:18 compute-0 ovn_controller[95419]: 2026-01-21T18:44:18Z|00210|binding|INFO|Setting lport e14186c1-07e6-4057-97ee-4c005c167ad0 up in Southbound
Jan 21 18:44:18 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:44:18.731 104698 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4a:52:84 10.100.0.8'], port_security=['fa:16:3e:4a:52:84 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9c57fa57-0050-4bf4-8378-976d31aaf23b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6ede9c1725ac4e8ea38db9268265acb5', 'neutron:revision_number': '2', 'neutron:security_group_ids': '3b8e32c8-3299-4f4d-8345-c78bc6e7466e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9133fe9e-7d02-4b64-9125-410a8dad613d, chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>], logical_port=e14186c1-07e6-4057-97ee-4c005c167ad0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 18:44:18 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:44:18.732 104698 INFO neutron.agent.ovn.metadata.agent [-] Port e14186c1-07e6-4057-97ee-4c005c167ad0 in datapath 9c57fa57-0050-4bf4-8378-976d31aaf23b bound to our chassis
Jan 21 18:44:18 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:44:18.733 104698 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9c57fa57-0050-4bf4-8378-976d31aaf23b
Jan 21 18:44:18 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:44:18.745 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[d004a4a8-360c-4620-b931-705ff343e86f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:44:18 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:44:18.746 104698 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9c57fa57-01 in ovnmeta-9c57fa57-0050-4bf4-8378-976d31aaf23b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 21 18:44:18 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:44:18.748 203892 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9c57fa57-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 21 18:44:18 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:44:18.748 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[5a258d10-ef12-4d7d-9b5f-3b7379f6b22f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:44:18 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:44:18.749 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[8cff3df0-11db-40a1-93ce-ad03b9202f4d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:44:18 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:44:18.758 105036 DEBUG oslo.privsep.daemon [-] privsep: reply[bfb4f62b-90fd-417d-8b1f-ffbc5c6ceae9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:44:18 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:44:18.781 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[e867f7e2-cf50-495d-9f0e-ed7150ff6a30]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:44:18 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:44:18.808 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[15f37fe6-19fe-423e-9c2a-f6ba16989229]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:44:18 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:44:18.815 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[6d0396b6-8e0c-4d2d-88a3-b4b00462c821]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:44:18 compute-0 systemd-udevd[213841]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 18:44:18 compute-0 NetworkManager[55506]: <info>  [1769021058.8163] manager: (tap9c57fa57-00): new Veth device (/org/freedesktop/NetworkManager/Devices/90)
Jan 21 18:44:18 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:44:18.846 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[5b5eb53e-a66c-4815-b54c-e56906082aa1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:44:18 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:44:18.850 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[da8d8afd-6bbd-45f4-869f-7237162806c4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:44:18 compute-0 NetworkManager[55506]: <info>  [1769021058.8732] device (tap9c57fa57-00): carrier: link connected
Jan 21 18:44:18 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:44:18.879 203906 DEBUG oslo.privsep.daemon [-] privsep: reply[2384feab-6578-4914-ab27-1af374adf4f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:44:18 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:44:18.896 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[bc31bbe8-7fb6-4132-b59c-6d9c7f8fc227]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9c57fa57-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6d:16:f6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 63], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 562248, 'reachable_time': 19879, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213872, 'error': None, 'target': 'ovnmeta-9c57fa57-0050-4bf4-8378-976d31aaf23b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:44:18 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:44:18.910 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[230391fc-e1db-4667-9b4f-f4cab7541e32]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe6d:16f6'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 562248, 'tstamp': 562248}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213873, 'error': None, 'target': 'ovnmeta-9c57fa57-0050-4bf4-8378-976d31aaf23b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:44:18 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:44:18.926 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[61ca1ad8-06fd-431d-a79c-eaee59cf045c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9c57fa57-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6d:16:f6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 63], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 562248, 'reachable_time': 19879, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 213874, 'error': None, 'target': 'ovnmeta-9c57fa57-0050-4bf4-8378-976d31aaf23b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:44:18 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:44:18.958 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[9f233d01-77b6-4703-8548-0fc76b4caa4c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:44:19 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:44:19.025 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[e715f8e9-fa3b-4576-9084-0000a8c05585]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:44:19 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:44:19.027 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9c57fa57-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:44:19 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:44:19.027 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 18:44:19 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:44:19.028 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9c57fa57-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:44:19 compute-0 nova_compute[183278]: 2026-01-21 18:44:19.030 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:44:19 compute-0 NetworkManager[55506]: <info>  [1769021059.0307] manager: (tap9c57fa57-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/91)
Jan 21 18:44:19 compute-0 kernel: tap9c57fa57-00: entered promiscuous mode
Jan 21 18:44:19 compute-0 nova_compute[183278]: 2026-01-21 18:44:19.031 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:44:19 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:44:19.034 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9c57fa57-00, col_values=(('external_ids', {'iface-id': '9bd47032-c426-48fa-a15e-e1952f4445a3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:44:19 compute-0 nova_compute[183278]: 2026-01-21 18:44:19.035 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:44:19 compute-0 ovn_controller[95419]: 2026-01-21T18:44:19Z|00211|binding|INFO|Releasing lport 9bd47032-c426-48fa-a15e-e1952f4445a3 from this chassis (sb_readonly=0)
Jan 21 18:44:19 compute-0 nova_compute[183278]: 2026-01-21 18:44:19.035 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:44:19 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:44:19.036 104698 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9c57fa57-0050-4bf4-8378-976d31aaf23b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9c57fa57-0050-4bf4-8378-976d31aaf23b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 21 18:44:19 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:44:19.037 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[b3ae54a6-8d20-4910-be29-d946ca01c3f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:44:19 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:44:19.038 104698 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 18:44:19 compute-0 ovn_metadata_agent[104693]: global
Jan 21 18:44:19 compute-0 ovn_metadata_agent[104693]:     log         /dev/log local0 debug
Jan 21 18:44:19 compute-0 ovn_metadata_agent[104693]:     log-tag     haproxy-metadata-proxy-9c57fa57-0050-4bf4-8378-976d31aaf23b
Jan 21 18:44:19 compute-0 ovn_metadata_agent[104693]:     user        root
Jan 21 18:44:19 compute-0 ovn_metadata_agent[104693]:     group       root
Jan 21 18:44:19 compute-0 ovn_metadata_agent[104693]:     maxconn     1024
Jan 21 18:44:19 compute-0 ovn_metadata_agent[104693]:     pidfile     /var/lib/neutron/external/pids/9c57fa57-0050-4bf4-8378-976d31aaf23b.pid.haproxy
Jan 21 18:44:19 compute-0 ovn_metadata_agent[104693]:     daemon
Jan 21 18:44:19 compute-0 ovn_metadata_agent[104693]: 
Jan 21 18:44:19 compute-0 ovn_metadata_agent[104693]: defaults
Jan 21 18:44:19 compute-0 ovn_metadata_agent[104693]:     log global
Jan 21 18:44:19 compute-0 ovn_metadata_agent[104693]:     mode http
Jan 21 18:44:19 compute-0 ovn_metadata_agent[104693]:     option httplog
Jan 21 18:44:19 compute-0 ovn_metadata_agent[104693]:     option dontlognull
Jan 21 18:44:19 compute-0 ovn_metadata_agent[104693]:     option http-server-close
Jan 21 18:44:19 compute-0 ovn_metadata_agent[104693]:     option forwardfor
Jan 21 18:44:19 compute-0 ovn_metadata_agent[104693]:     retries                 3
Jan 21 18:44:19 compute-0 ovn_metadata_agent[104693]:     timeout http-request    30s
Jan 21 18:44:19 compute-0 ovn_metadata_agent[104693]:     timeout connect         30s
Jan 21 18:44:19 compute-0 ovn_metadata_agent[104693]:     timeout client          32s
Jan 21 18:44:19 compute-0 ovn_metadata_agent[104693]:     timeout server          32s
Jan 21 18:44:19 compute-0 ovn_metadata_agent[104693]:     timeout http-keep-alive 30s
Jan 21 18:44:19 compute-0 ovn_metadata_agent[104693]: 
Jan 21 18:44:19 compute-0 ovn_metadata_agent[104693]: 
Jan 21 18:44:19 compute-0 ovn_metadata_agent[104693]: listen listener
Jan 21 18:44:19 compute-0 ovn_metadata_agent[104693]:     bind 169.254.169.254:80
Jan 21 18:44:19 compute-0 ovn_metadata_agent[104693]:     server metadata /var/lib/neutron/metadata_proxy
Jan 21 18:44:19 compute-0 ovn_metadata_agent[104693]:     http-request add-header X-OVN-Network-ID 9c57fa57-0050-4bf4-8378-976d31aaf23b
Jan 21 18:44:19 compute-0 ovn_metadata_agent[104693]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 21 18:44:19 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:44:19.040 104698 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9c57fa57-0050-4bf4-8378-976d31aaf23b', 'env', 'PROCESS_TAG=haproxy-9c57fa57-0050-4bf4-8378-976d31aaf23b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9c57fa57-0050-4bf4-8378-976d31aaf23b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 21 18:44:19 compute-0 nova_compute[183278]: 2026-01-21 18:44:19.049 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:44:19 compute-0 nova_compute[183278]: 2026-01-21 18:44:19.083 183284 DEBUG nova.virt.driver [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Emitting event <LifecycleEvent: 1769021059.0822053, c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:44:19 compute-0 nova_compute[183278]: 2026-01-21 18:44:19.083 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] VM Started (Lifecycle Event)
Jan 21 18:44:19 compute-0 nova_compute[183278]: 2026-01-21 18:44:19.108 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:44:19 compute-0 nova_compute[183278]: 2026-01-21 18:44:19.112 183284 DEBUG nova.virt.driver [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Emitting event <LifecycleEvent: 1769021059.0826485, c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:44:19 compute-0 nova_compute[183278]: 2026-01-21 18:44:19.112 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] VM Paused (Lifecycle Event)
Jan 21 18:44:19 compute-0 nova_compute[183278]: 2026-01-21 18:44:19.133 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:44:19 compute-0 nova_compute[183278]: 2026-01-21 18:44:19.136 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 18:44:19 compute-0 nova_compute[183278]: 2026-01-21 18:44:19.164 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 18:44:19 compute-0 nova_compute[183278]: 2026-01-21 18:44:19.221 183284 DEBUG nova.compute.manager [req-fd3a691f-2409-40a8-acde-8e06d8d4224e req-e5dac706-8de7-4ed2-bce8-56c7ec9e0418 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Received event network-vif-plugged-e14186c1-07e6-4057-97ee-4c005c167ad0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:44:19 compute-0 nova_compute[183278]: 2026-01-21 18:44:19.221 183284 DEBUG oslo_concurrency.lockutils [req-fd3a691f-2409-40a8-acde-8e06d8d4224e req-e5dac706-8de7-4ed2-bce8-56c7ec9e0418 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:44:19 compute-0 nova_compute[183278]: 2026-01-21 18:44:19.222 183284 DEBUG oslo_concurrency.lockutils [req-fd3a691f-2409-40a8-acde-8e06d8d4224e req-e5dac706-8de7-4ed2-bce8-56c7ec9e0418 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:44:19 compute-0 nova_compute[183278]: 2026-01-21 18:44:19.222 183284 DEBUG oslo_concurrency.lockutils [req-fd3a691f-2409-40a8-acde-8e06d8d4224e req-e5dac706-8de7-4ed2-bce8-56c7ec9e0418 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:44:19 compute-0 nova_compute[183278]: 2026-01-21 18:44:19.222 183284 DEBUG nova.compute.manager [req-fd3a691f-2409-40a8-acde-8e06d8d4224e req-e5dac706-8de7-4ed2-bce8-56c7ec9e0418 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Processing event network-vif-plugged-e14186c1-07e6-4057-97ee-4c005c167ad0 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 21 18:44:19 compute-0 nova_compute[183278]: 2026-01-21 18:44:19.223 183284 DEBUG nova.compute.manager [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 18:44:19 compute-0 nova_compute[183278]: 2026-01-21 18:44:19.226 183284 DEBUG nova.virt.driver [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Emitting event <LifecycleEvent: 1769021059.2261243, c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:44:19 compute-0 nova_compute[183278]: 2026-01-21 18:44:19.226 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] VM Resumed (Lifecycle Event)
Jan 21 18:44:19 compute-0 nova_compute[183278]: 2026-01-21 18:44:19.228 183284 DEBUG nova.virt.libvirt.driver [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 18:44:19 compute-0 nova_compute[183278]: 2026-01-21 18:44:19.231 183284 INFO nova.virt.libvirt.driver [-] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Instance spawned successfully.
Jan 21 18:44:19 compute-0 nova_compute[183278]: 2026-01-21 18:44:19.231 183284 DEBUG nova.virt.libvirt.driver [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 21 18:44:19 compute-0 nova_compute[183278]: 2026-01-21 18:44:19.255 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:44:19 compute-0 nova_compute[183278]: 2026-01-21 18:44:19.256 183284 DEBUG nova.network.neutron [req-3da047c1-81a0-45d6-b3df-d3a92da09151 req-b3ea318b-a5f3-41f0-a06a-7c35e298caed 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Updated VIF entry in instance network info cache for port e14186c1-07e6-4057-97ee-4c005c167ad0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 18:44:19 compute-0 nova_compute[183278]: 2026-01-21 18:44:19.257 183284 DEBUG nova.network.neutron [req-3da047c1-81a0-45d6-b3df-d3a92da09151 req-b3ea318b-a5f3-41f0-a06a-7c35e298caed 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Updating instance_info_cache with network_info: [{"id": "e14186c1-07e6-4057-97ee-4c005c167ad0", "address": "fa:16:3e:4a:52:84", "network": {"id": "9c57fa57-0050-4bf4-8378-976d31aaf23b", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-751634950-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ede9c1725ac4e8ea38db9268265acb5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape14186c1-07", "ovs_interfaceid": "e14186c1-07e6-4057-97ee-4c005c167ad0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:44:19 compute-0 nova_compute[183278]: 2026-01-21 18:44:19.262 183284 DEBUG nova.virt.libvirt.driver [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:44:19 compute-0 nova_compute[183278]: 2026-01-21 18:44:19.263 183284 DEBUG nova.virt.libvirt.driver [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:44:19 compute-0 nova_compute[183278]: 2026-01-21 18:44:19.263 183284 DEBUG nova.virt.libvirt.driver [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:44:19 compute-0 nova_compute[183278]: 2026-01-21 18:44:19.263 183284 DEBUG nova.virt.libvirt.driver [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:44:19 compute-0 nova_compute[183278]: 2026-01-21 18:44:19.264 183284 DEBUG nova.virt.libvirt.driver [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:44:19 compute-0 nova_compute[183278]: 2026-01-21 18:44:19.264 183284 DEBUG nova.virt.libvirt.driver [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:44:19 compute-0 nova_compute[183278]: 2026-01-21 18:44:19.269 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 18:44:19 compute-0 podman[213913]: 2026-01-21 18:44:19.415658907 +0000 UTC m=+0.047712674 container create cb8f3ea829023eab531b3968b9ecb2f24b5c0867d0480fd463bf5c866e7bab98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9c57fa57-0050-4bf4-8378-976d31aaf23b, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 18:44:19 compute-0 systemd[1]: Started libpod-conmon-cb8f3ea829023eab531b3968b9ecb2f24b5c0867d0480fd463bf5c866e7bab98.scope.
Jan 21 18:44:19 compute-0 systemd[1]: Started libcrun container.
Jan 21 18:44:19 compute-0 nova_compute[183278]: 2026-01-21 18:44:19.476 183284 DEBUG oslo_concurrency.lockutils [req-3da047c1-81a0-45d6-b3df-d3a92da09151 req-b3ea318b-a5f3-41f0-a06a-7c35e298caed 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Releasing lock "refresh_cache-c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 18:44:19 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1078ead15b21aaf98f835bb3f762847d931b4aeb0b8d6adf94927f6401742fc9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 18:44:19 compute-0 podman[213913]: 2026-01-21 18:44:19.389817703 +0000 UTC m=+0.021871490 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 18:44:19 compute-0 podman[213913]: 2026-01-21 18:44:19.4898231 +0000 UTC m=+0.121876887 container init cb8f3ea829023eab531b3968b9ecb2f24b5c0867d0480fd463bf5c866e7bab98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9c57fa57-0050-4bf4-8378-976d31aaf23b, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true)
Jan 21 18:44:19 compute-0 podman[213913]: 2026-01-21 18:44:19.496942022 +0000 UTC m=+0.128995789 container start cb8f3ea829023eab531b3968b9ecb2f24b5c0867d0480fd463bf5c866e7bab98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9c57fa57-0050-4bf4-8378-976d31aaf23b, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 21 18:44:19 compute-0 nova_compute[183278]: 2026-01-21 18:44:19.505 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 18:44:19 compute-0 neutron-haproxy-ovnmeta-9c57fa57-0050-4bf4-8378-976d31aaf23b[213928]: [NOTICE]   (213932) : New worker (213934) forked
Jan 21 18:44:19 compute-0 neutron-haproxy-ovnmeta-9c57fa57-0050-4bf4-8378-976d31aaf23b[213928]: [NOTICE]   (213932) : Loading success.
Jan 21 18:44:19 compute-0 nova_compute[183278]: 2026-01-21 18:44:19.536 183284 INFO nova.compute.manager [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Took 5.13 seconds to spawn the instance on the hypervisor.
Jan 21 18:44:19 compute-0 nova_compute[183278]: 2026-01-21 18:44:19.537 183284 DEBUG nova.compute.manager [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:44:19 compute-0 nova_compute[183278]: 2026-01-21 18:44:19.629 183284 INFO nova.compute.manager [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Took 5.62 seconds to build instance.
Jan 21 18:44:19 compute-0 nova_compute[183278]: 2026-01-21 18:44:19.696 183284 DEBUG oslo_concurrency.lockutils [None req-662ad032-36c0-4ebc-939f-05a1aff78e97 822dc3e9895a43a190ab7f3b466742b3 6ede9c1725ac4e8ea38db9268265acb5 - - default default] Lock "c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.757s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:44:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:44:20.117 104698 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:44:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:44:20.118 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:44:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:44:20.119 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:44:21 compute-0 nova_compute[183278]: 2026-01-21 18:44:21.713 183284 DEBUG nova.compute.manager [req-df9bbbf3-0e99-45af-b4d2-00f5eb49792b req-19af6f63-847c-4f90-ab2b-f0bd4f879b13 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Received event network-vif-plugged-e14186c1-07e6-4057-97ee-4c005c167ad0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:44:21 compute-0 nova_compute[183278]: 2026-01-21 18:44:21.713 183284 DEBUG oslo_concurrency.lockutils [req-df9bbbf3-0e99-45af-b4d2-00f5eb49792b req-19af6f63-847c-4f90-ab2b-f0bd4f879b13 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:44:21 compute-0 nova_compute[183278]: 2026-01-21 18:44:21.714 183284 DEBUG oslo_concurrency.lockutils [req-df9bbbf3-0e99-45af-b4d2-00f5eb49792b req-19af6f63-847c-4f90-ab2b-f0bd4f879b13 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:44:21 compute-0 nova_compute[183278]: 2026-01-21 18:44:21.714 183284 DEBUG oslo_concurrency.lockutils [req-df9bbbf3-0e99-45af-b4d2-00f5eb49792b req-19af6f63-847c-4f90-ab2b-f0bd4f879b13 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:44:21 compute-0 nova_compute[183278]: 2026-01-21 18:44:21.714 183284 DEBUG nova.compute.manager [req-df9bbbf3-0e99-45af-b4d2-00f5eb49792b req-19af6f63-847c-4f90-ab2b-f0bd4f879b13 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] No waiting events found dispatching network-vif-plugged-e14186c1-07e6-4057-97ee-4c005c167ad0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:44:21 compute-0 nova_compute[183278]: 2026-01-21 18:44:21.714 183284 WARNING nova.compute.manager [req-df9bbbf3-0e99-45af-b4d2-00f5eb49792b req-19af6f63-847c-4f90-ab2b-f0bd4f879b13 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Received unexpected event network-vif-plugged-e14186c1-07e6-4057-97ee-4c005c167ad0 for instance with vm_state active and task_state None.
Jan 21 18:44:21 compute-0 nova_compute[183278]: 2026-01-21 18:44:21.828 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:44:22 compute-0 nova_compute[183278]: 2026-01-21 18:44:22.994 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:44:26 compute-0 nova_compute[183278]: 2026-01-21 18:44:26.830 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:44:27 compute-0 nova_compute[183278]: 2026-01-21 18:44:27.997 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:44:28 compute-0 nova_compute[183278]: 2026-01-21 18:44:28.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:44:28 compute-0 nova_compute[183278]: 2026-01-21 18:44:28.817 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 18:44:28 compute-0 nova_compute[183278]: 2026-01-21 18:44:28.817 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 18:44:28 compute-0 nova_compute[183278]: 2026-01-21 18:44:28.957 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "refresh_cache-c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:44:28 compute-0 nova_compute[183278]: 2026-01-21 18:44:28.957 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquired lock "refresh_cache-c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 18:44:28 compute-0 nova_compute[183278]: 2026-01-21 18:44:28.958 183284 DEBUG nova.network.neutron [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 21 18:44:28 compute-0 nova_compute[183278]: 2026-01-21 18:44:28.958 183284 DEBUG nova.objects.instance [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lazy-loading 'info_cache' on Instance uuid c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:44:29 compute-0 podman[213943]: 2026-01-21 18:44:29.007431859 +0000 UTC m=+0.061338264 container health_status 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., name=ubi9-minimal, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, version=9.6, container_name=openstack_network_exporter, architecture=x86_64, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1755695350)
Jan 21 18:44:29 compute-0 podman[192560]: time="2026-01-21T18:44:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 18:44:29 compute-0 podman[192560]: @ - - [21/Jan/2026:18:44:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16587 "" "Go-http-client/1.1"
Jan 21 18:44:29 compute-0 podman[192560]: @ - - [21/Jan/2026:18:44:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2646 "" "Go-http-client/1.1"
Jan 21 18:44:30 compute-0 nova_compute[183278]: 2026-01-21 18:44:30.281 183284 DEBUG nova.network.neutron [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Updating instance_info_cache with network_info: [{"id": "e14186c1-07e6-4057-97ee-4c005c167ad0", "address": "fa:16:3e:4a:52:84", "network": {"id": "9c57fa57-0050-4bf4-8378-976d31aaf23b", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-751634950-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ede9c1725ac4e8ea38db9268265acb5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape14186c1-07", "ovs_interfaceid": "e14186c1-07e6-4057-97ee-4c005c167ad0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:44:30 compute-0 nova_compute[183278]: 2026-01-21 18:44:30.348 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Releasing lock "refresh_cache-c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 18:44:30 compute-0 nova_compute[183278]: 2026-01-21 18:44:30.348 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 21 18:44:30 compute-0 nova_compute[183278]: 2026-01-21 18:44:30.424 183284 DEBUG nova.virt.libvirt.driver [None req-0cbd730a-c6a4-4ffa-bc3b-ab4948b4c1be 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Check if temp file /var/lib/nova/instances/tmpy8wweh4t exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Jan 21 18:44:30 compute-0 nova_compute[183278]: 2026-01-21 18:44:30.425 183284 DEBUG nova.compute.manager [None req-0cbd730a-c6a4-4ffa-bc3b-ab4948b4c1be 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpy8wweh4t',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Jan 21 18:44:31 compute-0 nova_compute[183278]: 2026-01-21 18:44:31.204 183284 DEBUG oslo_concurrency.processutils [None req-0cbd730a-c6a4-4ffa-bc3b-ab4948b4c1be 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:44:31 compute-0 nova_compute[183278]: 2026-01-21 18:44:31.259 183284 DEBUG oslo_concurrency.processutils [None req-0cbd730a-c6a4-4ffa-bc3b-ab4948b4c1be 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:44:31 compute-0 nova_compute[183278]: 2026-01-21 18:44:31.260 183284 DEBUG oslo_concurrency.processutils [None req-0cbd730a-c6a4-4ffa-bc3b-ab4948b4c1be 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:44:31 compute-0 nova_compute[183278]: 2026-01-21 18:44:31.321 183284 DEBUG oslo_concurrency.processutils [None req-0cbd730a-c6a4-4ffa-bc3b-ab4948b4c1be 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:44:31 compute-0 openstack_network_exporter[195402]: ERROR   18:44:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 21 18:44:31 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:44:31 compute-0 openstack_network_exporter[195402]: ERROR   18:44:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 21 18:44:31 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:44:31 compute-0 nova_compute[183278]: 2026-01-21 18:44:31.833 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:44:32 compute-0 ovn_controller[95419]: 2026-01-21T18:44:32Z|00034|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:4a:52:84 10.100.0.8
Jan 21 18:44:32 compute-0 ovn_controller[95419]: 2026-01-21T18:44:32Z|00035|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:4a:52:84 10.100.0.8
Jan 21 18:44:32 compute-0 sshd-session[213990]: Accepted publickey for nova from 192.168.122.101 port 49696 ssh2: ECDSA SHA256:29a5JNhHHz2bb0ACqZTr6qOKeSRnhiTRA8SK+rzn9gs
Jan 21 18:44:32 compute-0 systemd-logind[782]: New session 44 of user nova.
Jan 21 18:44:32 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Jan 21 18:44:32 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 21 18:44:33 compute-0 nova_compute[183278]: 2026-01-21 18:44:32.999 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:44:33 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 21 18:44:33 compute-0 systemd[1]: Starting User Manager for UID 42436...
Jan 21 18:44:33 compute-0 systemd[213994]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 21 18:44:33 compute-0 systemd[213994]: Queued start job for default target Main User Target.
Jan 21 18:44:33 compute-0 systemd[213994]: Created slice User Application Slice.
Jan 21 18:44:33 compute-0 systemd[213994]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 21 18:44:33 compute-0 systemd[213994]: Started Daily Cleanup of User's Temporary Directories.
Jan 21 18:44:33 compute-0 systemd[213994]: Reached target Paths.
Jan 21 18:44:33 compute-0 systemd[213994]: Reached target Timers.
Jan 21 18:44:33 compute-0 systemd[213994]: Starting D-Bus User Message Bus Socket...
Jan 21 18:44:33 compute-0 systemd[213994]: Starting Create User's Volatile Files and Directories...
Jan 21 18:44:33 compute-0 systemd[213994]: Listening on D-Bus User Message Bus Socket.
Jan 21 18:44:33 compute-0 systemd[213994]: Reached target Sockets.
Jan 21 18:44:33 compute-0 systemd[213994]: Finished Create User's Volatile Files and Directories.
Jan 21 18:44:33 compute-0 systemd[213994]: Reached target Basic System.
Jan 21 18:44:33 compute-0 systemd[213994]: Reached target Main User Target.
Jan 21 18:44:33 compute-0 systemd[213994]: Startup finished in 114ms.
Jan 21 18:44:33 compute-0 systemd[1]: Started User Manager for UID 42436.
Jan 21 18:44:33 compute-0 systemd[1]: Started Session 44 of User nova.
Jan 21 18:44:33 compute-0 sshd-session[213990]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 21 18:44:33 compute-0 sshd-session[214009]: Received disconnect from 192.168.122.101 port 49696:11: disconnected by user
Jan 21 18:44:33 compute-0 sshd-session[214009]: Disconnected from user nova 192.168.122.101 port 49696
Jan 21 18:44:33 compute-0 sshd-session[213990]: pam_unix(sshd:session): session closed for user nova
Jan 21 18:44:33 compute-0 systemd[1]: session-44.scope: Deactivated successfully.
Jan 21 18:44:33 compute-0 systemd-logind[782]: Session 44 logged out. Waiting for processes to exit.
Jan 21 18:44:33 compute-0 systemd-logind[782]: Removed session 44.
Jan 21 18:44:33 compute-0 nova_compute[183278]: 2026-01-21 18:44:33.894 183284 DEBUG nova.compute.manager [req-962ff6e8-c23f-4f77-a86a-3cf8141eac1a req-f8e4f844-9eed-45c2-9416-82cac32069d2 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Received event network-vif-unplugged-e14186c1-07e6-4057-97ee-4c005c167ad0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:44:33 compute-0 nova_compute[183278]: 2026-01-21 18:44:33.895 183284 DEBUG oslo_concurrency.lockutils [req-962ff6e8-c23f-4f77-a86a-3cf8141eac1a req-f8e4f844-9eed-45c2-9416-82cac32069d2 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:44:33 compute-0 nova_compute[183278]: 2026-01-21 18:44:33.896 183284 DEBUG oslo_concurrency.lockutils [req-962ff6e8-c23f-4f77-a86a-3cf8141eac1a req-f8e4f844-9eed-45c2-9416-82cac32069d2 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:44:33 compute-0 nova_compute[183278]: 2026-01-21 18:44:33.896 183284 DEBUG oslo_concurrency.lockutils [req-962ff6e8-c23f-4f77-a86a-3cf8141eac1a req-f8e4f844-9eed-45c2-9416-82cac32069d2 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:44:33 compute-0 nova_compute[183278]: 2026-01-21 18:44:33.896 183284 DEBUG nova.compute.manager [req-962ff6e8-c23f-4f77-a86a-3cf8141eac1a req-f8e4f844-9eed-45c2-9416-82cac32069d2 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] No waiting events found dispatching network-vif-unplugged-e14186c1-07e6-4057-97ee-4c005c167ad0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:44:33 compute-0 nova_compute[183278]: 2026-01-21 18:44:33.896 183284 DEBUG nova.compute.manager [req-962ff6e8-c23f-4f77-a86a-3cf8141eac1a req-f8e4f844-9eed-45c2-9416-82cac32069d2 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Received event network-vif-unplugged-e14186c1-07e6-4057-97ee-4c005c167ad0 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 21 18:44:34 compute-0 nova_compute[183278]: 2026-01-21 18:44:34.343 183284 INFO nova.compute.manager [None req-0cbd730a-c6a4-4ffa-bc3b-ab4948b4c1be 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Took 3.02 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Jan 21 18:44:34 compute-0 nova_compute[183278]: 2026-01-21 18:44:34.343 183284 DEBUG nova.compute.manager [None req-0cbd730a-c6a4-4ffa-bc3b-ab4948b4c1be 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 18:44:34 compute-0 nova_compute[183278]: 2026-01-21 18:44:34.361 183284 DEBUG nova.compute.manager [None req-0cbd730a-c6a4-4ffa-bc3b-ab4948b4c1be 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpy8wweh4t',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(a52012a1-9293-4578-a442-fcb6740ebc5a),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Jan 21 18:44:34 compute-0 nova_compute[183278]: 2026-01-21 18:44:34.390 183284 DEBUG nova.objects.instance [None req-0cbd730a-c6a4-4ffa-bc3b-ab4948b4c1be 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lazy-loading 'migration_context' on Instance uuid c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:44:34 compute-0 nova_compute[183278]: 2026-01-21 18:44:34.391 183284 DEBUG nova.virt.libvirt.driver [None req-0cbd730a-c6a4-4ffa-bc3b-ab4948b4c1be 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Jan 21 18:44:34 compute-0 nova_compute[183278]: 2026-01-21 18:44:34.392 183284 DEBUG nova.virt.libvirt.driver [None req-0cbd730a-c6a4-4ffa-bc3b-ab4948b4c1be 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Jan 21 18:44:34 compute-0 nova_compute[183278]: 2026-01-21 18:44:34.392 183284 DEBUG nova.virt.libvirt.driver [None req-0cbd730a-c6a4-4ffa-bc3b-ab4948b4c1be 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Jan 21 18:44:34 compute-0 nova_compute[183278]: 2026-01-21 18:44:34.408 183284 DEBUG nova.virt.libvirt.vif [None req-0cbd730a-c6a4-4ffa-bc3b-ab4948b4c1be 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T18:44:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-417289613',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-417289613',id=29,image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T18:44:19Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='6ede9c1725ac4e8ea38db9268265acb5',ramdisk_id='',reservation_id='r-72t15kiz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1
',image_min_ram='0',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-1786177469',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-1786177469-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T18:44:19Z,user_data=None,user_id='822dc3e9895a43a190ab7f3b466742b3',uuid=c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e14186c1-07e6-4057-97ee-4c005c167ad0", "address": "fa:16:3e:4a:52:84", "network": {"id": "9c57fa57-0050-4bf4-8378-976d31aaf23b", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-751634950-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ede9c1725ac4e8ea38db9268265acb5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tape14186c1-07", "ovs_interfaceid": "e14186c1-07e6-4057-97ee-4c005c167ad0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 18:44:34 compute-0 nova_compute[183278]: 2026-01-21 18:44:34.409 183284 DEBUG nova.network.os_vif_util [None req-0cbd730a-c6a4-4ffa-bc3b-ab4948b4c1be 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Converting VIF {"id": "e14186c1-07e6-4057-97ee-4c005c167ad0", "address": "fa:16:3e:4a:52:84", "network": {"id": "9c57fa57-0050-4bf4-8378-976d31aaf23b", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-751634950-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ede9c1725ac4e8ea38db9268265acb5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tape14186c1-07", "ovs_interfaceid": "e14186c1-07e6-4057-97ee-4c005c167ad0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 18:44:34 compute-0 nova_compute[183278]: 2026-01-21 18:44:34.409 183284 DEBUG nova.network.os_vif_util [None req-0cbd730a-c6a4-4ffa-bc3b-ab4948b4c1be 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4a:52:84,bridge_name='br-int',has_traffic_filtering=True,id=e14186c1-07e6-4057-97ee-4c005c167ad0,network=Network(9c57fa57-0050-4bf4-8378-976d31aaf23b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape14186c1-07') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 18:44:34 compute-0 nova_compute[183278]: 2026-01-21 18:44:34.410 183284 DEBUG nova.virt.libvirt.migration [None req-0cbd730a-c6a4-4ffa-bc3b-ab4948b4c1be 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Updating guest XML with vif config: <interface type="ethernet">
Jan 21 18:44:34 compute-0 nova_compute[183278]:   <mac address="fa:16:3e:4a:52:84"/>
Jan 21 18:44:34 compute-0 nova_compute[183278]:   <model type="virtio"/>
Jan 21 18:44:34 compute-0 nova_compute[183278]:   <driver name="vhost" rx_queue_size="512"/>
Jan 21 18:44:34 compute-0 nova_compute[183278]:   <mtu size="1442"/>
Jan 21 18:44:34 compute-0 nova_compute[183278]:   <target dev="tape14186c1-07"/>
Jan 21 18:44:34 compute-0 nova_compute[183278]: </interface>
Jan 21 18:44:34 compute-0 nova_compute[183278]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Jan 21 18:44:34 compute-0 nova_compute[183278]: 2026-01-21 18:44:34.410 183284 DEBUG nova.virt.libvirt.driver [None req-0cbd730a-c6a4-4ffa-bc3b-ab4948b4c1be 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Jan 21 18:44:34 compute-0 nova_compute[183278]: 2026-01-21 18:44:34.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:44:34 compute-0 nova_compute[183278]: 2026-01-21 18:44:34.895 183284 DEBUG nova.virt.libvirt.migration [None req-0cbd730a-c6a4-4ffa-bc3b-ab4948b4c1be 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 21 18:44:34 compute-0 nova_compute[183278]: 2026-01-21 18:44:34.895 183284 INFO nova.virt.libvirt.migration [None req-0cbd730a-c6a4-4ffa-bc3b-ab4948b4c1be 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Increasing downtime to 50 ms after 0 sec elapsed time
Jan 21 18:44:34 compute-0 nova_compute[183278]: 2026-01-21 18:44:34.969 183284 INFO nova.virt.libvirt.driver [None req-0cbd730a-c6a4-4ffa-bc3b-ab4948b4c1be 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Jan 21 18:44:35 compute-0 nova_compute[183278]: 2026-01-21 18:44:35.472 183284 DEBUG nova.virt.libvirt.migration [None req-0cbd730a-c6a4-4ffa-bc3b-ab4948b4c1be 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 21 18:44:35 compute-0 nova_compute[183278]: 2026-01-21 18:44:35.473 183284 DEBUG nova.virt.libvirt.migration [None req-0cbd730a-c6a4-4ffa-bc3b-ab4948b4c1be 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Jan 21 18:44:35 compute-0 nova_compute[183278]: 2026-01-21 18:44:35.698 183284 DEBUG nova.virt.driver [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] Emitting event <LifecycleEvent: 1769021075.6983538, c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:44:35 compute-0 nova_compute[183278]: 2026-01-21 18:44:35.699 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] VM Paused (Lifecycle Event)
Jan 21 18:44:35 compute-0 rsyslogd[1002]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 21 18:44:35 compute-0 rsyslogd[1002]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 21 18:44:35 compute-0 nova_compute[183278]: 2026-01-21 18:44:35.719 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:44:35 compute-0 nova_compute[183278]: 2026-01-21 18:44:35.723 183284 DEBUG nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 18:44:35 compute-0 nova_compute[183278]: 2026-01-21 18:44:35.740 183284 INFO nova.compute.manager [None req-047ddc9f-12f7-41a6-ba0f-8f2c7a519c5b - - - - - -] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] During sync_power_state the instance has a pending task (migrating). Skip.
Jan 21 18:44:35 compute-0 nova_compute[183278]: 2026-01-21 18:44:35.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:44:35 compute-0 kernel: tape14186c1-07 (unregistering): left promiscuous mode
Jan 21 18:44:35 compute-0 NetworkManager[55506]: <info>  [1769021075.8493] device (tape14186c1-07): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 18:44:35 compute-0 ovn_controller[95419]: 2026-01-21T18:44:35Z|00212|binding|INFO|Releasing lport e14186c1-07e6-4057-97ee-4c005c167ad0 from this chassis (sb_readonly=0)
Jan 21 18:44:35 compute-0 ovn_controller[95419]: 2026-01-21T18:44:35Z|00213|binding|INFO|Setting lport e14186c1-07e6-4057-97ee-4c005c167ad0 down in Southbound
Jan 21 18:44:35 compute-0 nova_compute[183278]: 2026-01-21 18:44:35.859 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:44:35 compute-0 ovn_controller[95419]: 2026-01-21T18:44:35Z|00214|binding|INFO|Removing iface tape14186c1-07 ovn-installed in OVS
Jan 21 18:44:35 compute-0 nova_compute[183278]: 2026-01-21 18:44:35.861 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:44:35 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:44:35.866 104698 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4a:52:84 10.100.0.8'], port_security=['fa:16:3e:4a:52:84 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '88a62794-b4a4-47e3-9cce-91e574e684c1'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9c57fa57-0050-4bf4-8378-976d31aaf23b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6ede9c1725ac4e8ea38db9268265acb5', 'neutron:revision_number': '8', 'neutron:security_group_ids': '3b8e32c8-3299-4f4d-8345-c78bc6e7466e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9133fe9e-7d02-4b64-9125-410a8dad613d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>], logical_port=e14186c1-07e6-4057-97ee-4c005c167ad0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f45e8ec7b20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 18:44:35 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:44:35.867 104698 INFO neutron.agent.ovn.metadata.agent [-] Port e14186c1-07e6-4057-97ee-4c005c167ad0 in datapath 9c57fa57-0050-4bf4-8378-976d31aaf23b unbound from our chassis
Jan 21 18:44:35 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:44:35.869 104698 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9c57fa57-0050-4bf4-8378-976d31aaf23b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 21 18:44:35 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:44:35.870 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[e01833c0-7886-4c0b-aec4-a977fcb86a3d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:44:35 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:44:35.871 104698 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9c57fa57-0050-4bf4-8378-976d31aaf23b namespace which is not needed anymore
Jan 21 18:44:35 compute-0 nova_compute[183278]: 2026-01-21 18:44:35.879 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:44:35 compute-0 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d0000001d.scope: Deactivated successfully.
Jan 21 18:44:35 compute-0 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d0000001d.scope: Consumed 13.475s CPU time.
Jan 21 18:44:35 compute-0 systemd-machined[154592]: Machine qemu-20-instance-0000001d terminated.
Jan 21 18:44:35 compute-0 podman[214020]: 2026-01-21 18:44:35.938212785 +0000 UTC m=+0.061069977 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 18:44:35 compute-0 podman[214017]: 2026-01-21 18:44:35.962342998 +0000 UTC m=+0.086097702 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 18:44:35 compute-0 nova_compute[183278]: 2026-01-21 18:44:35.977 183284 DEBUG nova.compute.manager [req-63fd5323-69d3-477c-9bab-dfb1e2e96f2a req-ba021c61-dda1-4e40-b1d0-4d66874e6eb8 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Received event network-vif-plugged-e14186c1-07e6-4057-97ee-4c005c167ad0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:44:35 compute-0 nova_compute[183278]: 2026-01-21 18:44:35.977 183284 DEBUG oslo_concurrency.lockutils [req-63fd5323-69d3-477c-9bab-dfb1e2e96f2a req-ba021c61-dda1-4e40-b1d0-4d66874e6eb8 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:44:35 compute-0 nova_compute[183278]: 2026-01-21 18:44:35.977 183284 DEBUG oslo_concurrency.lockutils [req-63fd5323-69d3-477c-9bab-dfb1e2e96f2a req-ba021c61-dda1-4e40-b1d0-4d66874e6eb8 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:44:35 compute-0 nova_compute[183278]: 2026-01-21 18:44:35.977 183284 DEBUG oslo_concurrency.lockutils [req-63fd5323-69d3-477c-9bab-dfb1e2e96f2a req-ba021c61-dda1-4e40-b1d0-4d66874e6eb8 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:44:35 compute-0 nova_compute[183278]: 2026-01-21 18:44:35.978 183284 DEBUG nova.compute.manager [req-63fd5323-69d3-477c-9bab-dfb1e2e96f2a req-ba021c61-dda1-4e40-b1d0-4d66874e6eb8 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] No waiting events found dispatching network-vif-plugged-e14186c1-07e6-4057-97ee-4c005c167ad0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:44:35 compute-0 nova_compute[183278]: 2026-01-21 18:44:35.978 183284 WARNING nova.compute.manager [req-63fd5323-69d3-477c-9bab-dfb1e2e96f2a req-ba021c61-dda1-4e40-b1d0-4d66874e6eb8 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Received unexpected event network-vif-plugged-e14186c1-07e6-4057-97ee-4c005c167ad0 for instance with vm_state active and task_state migrating.
Jan 21 18:44:35 compute-0 nova_compute[183278]: 2026-01-21 18:44:35.978 183284 DEBUG nova.compute.manager [req-63fd5323-69d3-477c-9bab-dfb1e2e96f2a req-ba021c61-dda1-4e40-b1d0-4d66874e6eb8 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Received event network-changed-e14186c1-07e6-4057-97ee-4c005c167ad0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:44:35 compute-0 nova_compute[183278]: 2026-01-21 18:44:35.979 183284 DEBUG nova.compute.manager [req-63fd5323-69d3-477c-9bab-dfb1e2e96f2a req-ba021c61-dda1-4e40-b1d0-4d66874e6eb8 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Refreshing instance network info cache due to event network-changed-e14186c1-07e6-4057-97ee-4c005c167ad0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 18:44:35 compute-0 nova_compute[183278]: 2026-01-21 18:44:35.979 183284 DEBUG oslo_concurrency.lockutils [req-63fd5323-69d3-477c-9bab-dfb1e2e96f2a req-ba021c61-dda1-4e40-b1d0-4d66874e6eb8 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "refresh_cache-c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:44:35 compute-0 nova_compute[183278]: 2026-01-21 18:44:35.979 183284 DEBUG oslo_concurrency.lockutils [req-63fd5323-69d3-477c-9bab-dfb1e2e96f2a req-ba021c61-dda1-4e40-b1d0-4d66874e6eb8 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquired lock "refresh_cache-c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 18:44:35 compute-0 nova_compute[183278]: 2026-01-21 18:44:35.979 183284 DEBUG nova.network.neutron [req-63fd5323-69d3-477c-9bab-dfb1e2e96f2a req-ba021c61-dda1-4e40-b1d0-4d66874e6eb8 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Refreshing network info cache for port e14186c1-07e6-4057-97ee-4c005c167ad0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 18:44:35 compute-0 neutron-haproxy-ovnmeta-9c57fa57-0050-4bf4-8378-976d31aaf23b[213928]: [NOTICE]   (213932) : haproxy version is 2.8.14-c23fe91
Jan 21 18:44:35 compute-0 neutron-haproxy-ovnmeta-9c57fa57-0050-4bf4-8378-976d31aaf23b[213928]: [NOTICE]   (213932) : path to executable is /usr/sbin/haproxy
Jan 21 18:44:35 compute-0 neutron-haproxy-ovnmeta-9c57fa57-0050-4bf4-8378-976d31aaf23b[213928]: [WARNING]  (213932) : Exiting Master process...
Jan 21 18:44:35 compute-0 neutron-haproxy-ovnmeta-9c57fa57-0050-4bf4-8378-976d31aaf23b[213928]: [WARNING]  (213932) : Exiting Master process...
Jan 21 18:44:35 compute-0 neutron-haproxy-ovnmeta-9c57fa57-0050-4bf4-8378-976d31aaf23b[213928]: [ALERT]    (213932) : Current worker (213934) exited with code 143 (Terminated)
Jan 21 18:44:35 compute-0 neutron-haproxy-ovnmeta-9c57fa57-0050-4bf4-8378-976d31aaf23b[213928]: [WARNING]  (213932) : All workers exited. Exiting... (0)
Jan 21 18:44:35 compute-0 systemd[1]: libpod-cb8f3ea829023eab531b3968b9ecb2f24b5c0867d0480fd463bf5c866e7bab98.scope: Deactivated successfully.
Jan 21 18:44:36 compute-0 podman[214078]: 2026-01-21 18:44:36.004893338 +0000 UTC m=+0.042172252 container died cb8f3ea829023eab531b3968b9ecb2f24b5c0867d0480fd463bf5c866e7bab98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9c57fa57-0050-4bf4-8378-976d31aaf23b, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 21 18:44:36 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cb8f3ea829023eab531b3968b9ecb2f24b5c0867d0480fd463bf5c866e7bab98-userdata-shm.mount: Deactivated successfully.
Jan 21 18:44:36 compute-0 systemd[1]: var-lib-containers-storage-overlay-1078ead15b21aaf98f835bb3f762847d931b4aeb0b8d6adf94927f6401742fc9-merged.mount: Deactivated successfully.
Jan 21 18:44:36 compute-0 podman[214078]: 2026-01-21 18:44:36.037844874 +0000 UTC m=+0.075123788 container cleanup cb8f3ea829023eab531b3968b9ecb2f24b5c0867d0480fd463bf5c866e7bab98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9c57fa57-0050-4bf4-8378-976d31aaf23b, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Jan 21 18:44:36 compute-0 systemd[1]: libpod-conmon-cb8f3ea829023eab531b3968b9ecb2f24b5c0867d0480fd463bf5c866e7bab98.scope: Deactivated successfully.
Jan 21 18:44:36 compute-0 nova_compute[183278]: 2026-01-21 18:44:36.045 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:44:36 compute-0 nova_compute[183278]: 2026-01-21 18:44:36.050 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:44:36 compute-0 nova_compute[183278]: 2026-01-21 18:44:36.087 183284 DEBUG nova.virt.libvirt.guest [None req-0cbd730a-c6a4-4ffa-bc3b-ab4948b4c1be 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Jan 21 18:44:36 compute-0 nova_compute[183278]: 2026-01-21 18:44:36.087 183284 INFO nova.virt.libvirt.driver [None req-0cbd730a-c6a4-4ffa-bc3b-ab4948b4c1be 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Migration operation has completed
Jan 21 18:44:36 compute-0 nova_compute[183278]: 2026-01-21 18:44:36.088 183284 INFO nova.compute.manager [None req-0cbd730a-c6a4-4ffa-bc3b-ab4948b4c1be 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] _post_live_migration() is started..
Jan 21 18:44:36 compute-0 nova_compute[183278]: 2026-01-21 18:44:36.093 183284 DEBUG nova.virt.libvirt.driver [None req-0cbd730a-c6a4-4ffa-bc3b-ab4948b4c1be 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Jan 21 18:44:36 compute-0 nova_compute[183278]: 2026-01-21 18:44:36.093 183284 DEBUG nova.virt.libvirt.driver [None req-0cbd730a-c6a4-4ffa-bc3b-ab4948b4c1be 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Jan 21 18:44:36 compute-0 nova_compute[183278]: 2026-01-21 18:44:36.094 183284 DEBUG nova.virt.libvirt.driver [None req-0cbd730a-c6a4-4ffa-bc3b-ab4948b4c1be 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Jan 21 18:44:36 compute-0 podman[214115]: 2026-01-21 18:44:36.110146752 +0000 UTC m=+0.045444090 container remove cb8f3ea829023eab531b3968b9ecb2f24b5c0867d0480fd463bf5c866e7bab98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9c57fa57-0050-4bf4-8378-976d31aaf23b, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3)
Jan 21 18:44:36 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:44:36.115 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[5bdbc1b9-ecb0-4cd4-b65e-809717cb9d22]: (4, ('Wed Jan 21 06:44:35 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-9c57fa57-0050-4bf4-8378-976d31aaf23b (cb8f3ea829023eab531b3968b9ecb2f24b5c0867d0480fd463bf5c866e7bab98)\ncb8f3ea829023eab531b3968b9ecb2f24b5c0867d0480fd463bf5c866e7bab98\nWed Jan 21 06:44:36 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-9c57fa57-0050-4bf4-8378-976d31aaf23b (cb8f3ea829023eab531b3968b9ecb2f24b5c0867d0480fd463bf5c866e7bab98)\ncb8f3ea829023eab531b3968b9ecb2f24b5c0867d0480fd463bf5c866e7bab98\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:44:36 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:44:36.117 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[a273ae0c-492e-4f66-a248-7a2be80a30b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:44:36 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:44:36.118 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9c57fa57-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:44:36 compute-0 kernel: tap9c57fa57-00: left promiscuous mode
Jan 21 18:44:36 compute-0 nova_compute[183278]: 2026-01-21 18:44:36.119 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:44:36 compute-0 nova_compute[183278]: 2026-01-21 18:44:36.181 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:44:36 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:44:36.184 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[7903d05c-d702-4650-b0ea-9f2f219e2d21]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:44:36 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:44:36.198 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[f4c0cfe0-dea3-4cba-9134-c74846a92ac8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:44:36 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:44:36.199 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[4f1480e4-5e23-43d2-8000-e2862eda774c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:44:36 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:44:36.213 203892 DEBUG oslo.privsep.daemon [-] privsep: reply[ac26b0bd-a146-4de9-b002-87e2c5172fac]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 562241, 'reachable_time': 21841, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214147, 'error': None, 'target': 'ovnmeta-9c57fa57-0050-4bf4-8378-976d31aaf23b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:44:36 compute-0 systemd[1]: run-netns-ovnmeta\x2d9c57fa57\x2d0050\x2d4bf4\x2d8378\x2d976d31aaf23b.mount: Deactivated successfully.
Jan 21 18:44:36 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:44:36.215 105036 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9c57fa57-0050-4bf4-8378-976d31aaf23b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 21 18:44:36 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:44:36.216 105036 DEBUG oslo.privsep.daemon [-] privsep: reply[2f5fa4c1-6d94-4fbe-8994-1e5a7a7ad42f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:44:36 compute-0 sshd-session[214148]: Invalid user ubuntu from 64.227.98.100 port 58082
Jan 21 18:44:36 compute-0 nova_compute[183278]: 2026-01-21 18:44:36.811 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:44:36 compute-0 sshd-session[214148]: Connection closed by invalid user ubuntu 64.227.98.100 port 58082 [preauth]
Jan 21 18:44:36 compute-0 nova_compute[183278]: 2026-01-21 18:44:36.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:44:36 compute-0 nova_compute[183278]: 2026-01-21 18:44:36.835 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:44:37 compute-0 nova_compute[183278]: 2026-01-21 18:44:37.525 183284 DEBUG nova.network.neutron [req-63fd5323-69d3-477c-9bab-dfb1e2e96f2a req-ba021c61-dda1-4e40-b1d0-4d66874e6eb8 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Updated VIF entry in instance network info cache for port e14186c1-07e6-4057-97ee-4c005c167ad0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 18:44:37 compute-0 nova_compute[183278]: 2026-01-21 18:44:37.525 183284 DEBUG nova.network.neutron [req-63fd5323-69d3-477c-9bab-dfb1e2e96f2a req-ba021c61-dda1-4e40-b1d0-4d66874e6eb8 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Updating instance_info_cache with network_info: [{"id": "e14186c1-07e6-4057-97ee-4c005c167ad0", "address": "fa:16:3e:4a:52:84", "network": {"id": "9c57fa57-0050-4bf4-8378-976d31aaf23b", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-751634950-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ede9c1725ac4e8ea38db9268265acb5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape14186c1-07", "ovs_interfaceid": "e14186c1-07e6-4057-97ee-4c005c167ad0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:44:37 compute-0 nova_compute[183278]: 2026-01-21 18:44:37.552 183284 DEBUG oslo_concurrency.lockutils [req-63fd5323-69d3-477c-9bab-dfb1e2e96f2a req-ba021c61-dda1-4e40-b1d0-4d66874e6eb8 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Releasing lock "refresh_cache-c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 18:44:38 compute-0 nova_compute[183278]: 2026-01-21 18:44:38.001 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:44:38 compute-0 nova_compute[183278]: 2026-01-21 18:44:38.051 183284 DEBUG nova.network.neutron [None req-0cbd730a-c6a4-4ffa-bc3b-ab4948b4c1be 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Activated binding for port e14186c1-07e6-4057-97ee-4c005c167ad0 and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Jan 21 18:44:38 compute-0 nova_compute[183278]: 2026-01-21 18:44:38.052 183284 DEBUG nova.compute.manager [None req-0cbd730a-c6a4-4ffa-bc3b-ab4948b4c1be 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "e14186c1-07e6-4057-97ee-4c005c167ad0", "address": "fa:16:3e:4a:52:84", "network": {"id": "9c57fa57-0050-4bf4-8378-976d31aaf23b", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-751634950-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ede9c1725ac4e8ea38db9268265acb5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape14186c1-07", "ovs_interfaceid": "e14186c1-07e6-4057-97ee-4c005c167ad0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Jan 21 18:44:38 compute-0 nova_compute[183278]: 2026-01-21 18:44:38.053 183284 DEBUG nova.virt.libvirt.vif [None req-0cbd730a-c6a4-4ffa-bc3b-ab4948b4c1be 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T18:44:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-417289613',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-417289613',id=29,image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T18:44:19Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='6ede9c1725ac4e8ea38db9268265acb5',ramdisk_id='',reservation_id='r-72t15kiz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='672306ae-5521-4fc1-a825-a16d6d125c61',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-1786177469',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-1786177469-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T18:44:27Z,user_data=None,user_id='822dc3e9895a43a190ab7f3b466742b3',uuid=c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e14186c1-07e6-4057-97ee-4c005c167ad0", "address": "fa:16:3e:4a:52:84", "network": {"id": "9c57fa57-0050-4bf4-8378-976d31aaf23b", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-751634950-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ede9c1725ac4e8ea38db9268265acb5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape14186c1-07", "ovs_interfaceid": "e14186c1-07e6-4057-97ee-4c005c167ad0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 18:44:38 compute-0 nova_compute[183278]: 2026-01-21 18:44:38.053 183284 DEBUG nova.network.os_vif_util [None req-0cbd730a-c6a4-4ffa-bc3b-ab4948b4c1be 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Converting VIF {"id": "e14186c1-07e6-4057-97ee-4c005c167ad0", "address": "fa:16:3e:4a:52:84", "network": {"id": "9c57fa57-0050-4bf4-8378-976d31aaf23b", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-751634950-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ede9c1725ac4e8ea38db9268265acb5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape14186c1-07", "ovs_interfaceid": "e14186c1-07e6-4057-97ee-4c005c167ad0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 18:44:38 compute-0 nova_compute[183278]: 2026-01-21 18:44:38.054 183284 DEBUG nova.network.os_vif_util [None req-0cbd730a-c6a4-4ffa-bc3b-ab4948b4c1be 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4a:52:84,bridge_name='br-int',has_traffic_filtering=True,id=e14186c1-07e6-4057-97ee-4c005c167ad0,network=Network(9c57fa57-0050-4bf4-8378-976d31aaf23b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape14186c1-07') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 18:44:38 compute-0 nova_compute[183278]: 2026-01-21 18:44:38.054 183284 DEBUG os_vif [None req-0cbd730a-c6a4-4ffa-bc3b-ab4948b4c1be 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4a:52:84,bridge_name='br-int',has_traffic_filtering=True,id=e14186c1-07e6-4057-97ee-4c005c167ad0,network=Network(9c57fa57-0050-4bf4-8378-976d31aaf23b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape14186c1-07') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 21 18:44:38 compute-0 nova_compute[183278]: 2026-01-21 18:44:38.055 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:44:38 compute-0 nova_compute[183278]: 2026-01-21 18:44:38.056 183284 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape14186c1-07, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:44:38 compute-0 nova_compute[183278]: 2026-01-21 18:44:38.057 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:44:38 compute-0 nova_compute[183278]: 2026-01-21 18:44:38.058 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:44:38 compute-0 nova_compute[183278]: 2026-01-21 18:44:38.060 183284 INFO os_vif [None req-0cbd730a-c6a4-4ffa-bc3b-ab4948b4c1be 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4a:52:84,bridge_name='br-int',has_traffic_filtering=True,id=e14186c1-07e6-4057-97ee-4c005c167ad0,network=Network(9c57fa57-0050-4bf4-8378-976d31aaf23b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape14186c1-07')
Jan 21 18:44:38 compute-0 nova_compute[183278]: 2026-01-21 18:44:38.061 183284 DEBUG oslo_concurrency.lockutils [None req-0cbd730a-c6a4-4ffa-bc3b-ab4948b4c1be 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:44:38 compute-0 nova_compute[183278]: 2026-01-21 18:44:38.061 183284 DEBUG oslo_concurrency.lockutils [None req-0cbd730a-c6a4-4ffa-bc3b-ab4948b4c1be 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:44:38 compute-0 nova_compute[183278]: 2026-01-21 18:44:38.061 183284 DEBUG oslo_concurrency.lockutils [None req-0cbd730a-c6a4-4ffa-bc3b-ab4948b4c1be 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:44:38 compute-0 nova_compute[183278]: 2026-01-21 18:44:38.062 183284 DEBUG nova.compute.manager [None req-0cbd730a-c6a4-4ffa-bc3b-ab4948b4c1be 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Jan 21 18:44:38 compute-0 nova_compute[183278]: 2026-01-21 18:44:38.062 183284 INFO nova.virt.libvirt.driver [None req-0cbd730a-c6a4-4ffa-bc3b-ab4948b4c1be 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Deleting instance files /var/lib/nova/instances/c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2_del
Jan 21 18:44:38 compute-0 nova_compute[183278]: 2026-01-21 18:44:38.063 183284 INFO nova.virt.libvirt.driver [None req-0cbd730a-c6a4-4ffa-bc3b-ab4948b4c1be 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Deletion of /var/lib/nova/instances/c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2_del complete
Jan 21 18:44:38 compute-0 nova_compute[183278]: 2026-01-21 18:44:38.205 183284 DEBUG nova.compute.manager [req-57bd7559-938c-4f6d-aef5-75fee743c5fd req-522069a9-c0e6-45b0-9ca1-b83687a5ff5f 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Received event network-vif-unplugged-e14186c1-07e6-4057-97ee-4c005c167ad0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:44:38 compute-0 nova_compute[183278]: 2026-01-21 18:44:38.205 183284 DEBUG oslo_concurrency.lockutils [req-57bd7559-938c-4f6d-aef5-75fee743c5fd req-522069a9-c0e6-45b0-9ca1-b83687a5ff5f 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:44:38 compute-0 nova_compute[183278]: 2026-01-21 18:44:38.205 183284 DEBUG oslo_concurrency.lockutils [req-57bd7559-938c-4f6d-aef5-75fee743c5fd req-522069a9-c0e6-45b0-9ca1-b83687a5ff5f 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:44:38 compute-0 nova_compute[183278]: 2026-01-21 18:44:38.206 183284 DEBUG oslo_concurrency.lockutils [req-57bd7559-938c-4f6d-aef5-75fee743c5fd req-522069a9-c0e6-45b0-9ca1-b83687a5ff5f 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:44:38 compute-0 nova_compute[183278]: 2026-01-21 18:44:38.206 183284 DEBUG nova.compute.manager [req-57bd7559-938c-4f6d-aef5-75fee743c5fd req-522069a9-c0e6-45b0-9ca1-b83687a5ff5f 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] No waiting events found dispatching network-vif-unplugged-e14186c1-07e6-4057-97ee-4c005c167ad0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:44:38 compute-0 nova_compute[183278]: 2026-01-21 18:44:38.206 183284 DEBUG nova.compute.manager [req-57bd7559-938c-4f6d-aef5-75fee743c5fd req-522069a9-c0e6-45b0-9ca1-b83687a5ff5f 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Received event network-vif-unplugged-e14186c1-07e6-4057-97ee-4c005c167ad0 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 21 18:44:38 compute-0 nova_compute[183278]: 2026-01-21 18:44:38.206 183284 DEBUG nova.compute.manager [req-57bd7559-938c-4f6d-aef5-75fee743c5fd req-522069a9-c0e6-45b0-9ca1-b83687a5ff5f 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Received event network-vif-plugged-e14186c1-07e6-4057-97ee-4c005c167ad0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:44:38 compute-0 nova_compute[183278]: 2026-01-21 18:44:38.207 183284 DEBUG oslo_concurrency.lockutils [req-57bd7559-938c-4f6d-aef5-75fee743c5fd req-522069a9-c0e6-45b0-9ca1-b83687a5ff5f 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:44:38 compute-0 nova_compute[183278]: 2026-01-21 18:44:38.207 183284 DEBUG oslo_concurrency.lockutils [req-57bd7559-938c-4f6d-aef5-75fee743c5fd req-522069a9-c0e6-45b0-9ca1-b83687a5ff5f 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:44:38 compute-0 nova_compute[183278]: 2026-01-21 18:44:38.207 183284 DEBUG oslo_concurrency.lockutils [req-57bd7559-938c-4f6d-aef5-75fee743c5fd req-522069a9-c0e6-45b0-9ca1-b83687a5ff5f 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:44:38 compute-0 nova_compute[183278]: 2026-01-21 18:44:38.207 183284 DEBUG nova.compute.manager [req-57bd7559-938c-4f6d-aef5-75fee743c5fd req-522069a9-c0e6-45b0-9ca1-b83687a5ff5f 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] No waiting events found dispatching network-vif-plugged-e14186c1-07e6-4057-97ee-4c005c167ad0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:44:38 compute-0 nova_compute[183278]: 2026-01-21 18:44:38.207 183284 WARNING nova.compute.manager [req-57bd7559-938c-4f6d-aef5-75fee743c5fd req-522069a9-c0e6-45b0-9ca1-b83687a5ff5f 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Received unexpected event network-vif-plugged-e14186c1-07e6-4057-97ee-4c005c167ad0 for instance with vm_state active and task_state migrating.
Jan 21 18:44:38 compute-0 nova_compute[183278]: 2026-01-21 18:44:38.208 183284 DEBUG nova.compute.manager [req-57bd7559-938c-4f6d-aef5-75fee743c5fd req-522069a9-c0e6-45b0-9ca1-b83687a5ff5f 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Received event network-vif-plugged-e14186c1-07e6-4057-97ee-4c005c167ad0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:44:38 compute-0 nova_compute[183278]: 2026-01-21 18:44:38.208 183284 DEBUG oslo_concurrency.lockutils [req-57bd7559-938c-4f6d-aef5-75fee743c5fd req-522069a9-c0e6-45b0-9ca1-b83687a5ff5f 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:44:38 compute-0 nova_compute[183278]: 2026-01-21 18:44:38.208 183284 DEBUG oslo_concurrency.lockutils [req-57bd7559-938c-4f6d-aef5-75fee743c5fd req-522069a9-c0e6-45b0-9ca1-b83687a5ff5f 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:44:38 compute-0 nova_compute[183278]: 2026-01-21 18:44:38.208 183284 DEBUG oslo_concurrency.lockutils [req-57bd7559-938c-4f6d-aef5-75fee743c5fd req-522069a9-c0e6-45b0-9ca1-b83687a5ff5f 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:44:38 compute-0 nova_compute[183278]: 2026-01-21 18:44:38.209 183284 DEBUG nova.compute.manager [req-57bd7559-938c-4f6d-aef5-75fee743c5fd req-522069a9-c0e6-45b0-9ca1-b83687a5ff5f 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] No waiting events found dispatching network-vif-plugged-e14186c1-07e6-4057-97ee-4c005c167ad0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:44:38 compute-0 nova_compute[183278]: 2026-01-21 18:44:38.209 183284 WARNING nova.compute.manager [req-57bd7559-938c-4f6d-aef5-75fee743c5fd req-522069a9-c0e6-45b0-9ca1-b83687a5ff5f 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Received unexpected event network-vif-plugged-e14186c1-07e6-4057-97ee-4c005c167ad0 for instance with vm_state active and task_state migrating.
Jan 21 18:44:38 compute-0 nova_compute[183278]: 2026-01-21 18:44:38.209 183284 DEBUG nova.compute.manager [req-57bd7559-938c-4f6d-aef5-75fee743c5fd req-522069a9-c0e6-45b0-9ca1-b83687a5ff5f 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Received event network-vif-unplugged-e14186c1-07e6-4057-97ee-4c005c167ad0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:44:38 compute-0 nova_compute[183278]: 2026-01-21 18:44:38.209 183284 DEBUG oslo_concurrency.lockutils [req-57bd7559-938c-4f6d-aef5-75fee743c5fd req-522069a9-c0e6-45b0-9ca1-b83687a5ff5f 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:44:38 compute-0 nova_compute[183278]: 2026-01-21 18:44:38.210 183284 DEBUG oslo_concurrency.lockutils [req-57bd7559-938c-4f6d-aef5-75fee743c5fd req-522069a9-c0e6-45b0-9ca1-b83687a5ff5f 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:44:38 compute-0 nova_compute[183278]: 2026-01-21 18:44:38.210 183284 DEBUG oslo_concurrency.lockutils [req-57bd7559-938c-4f6d-aef5-75fee743c5fd req-522069a9-c0e6-45b0-9ca1-b83687a5ff5f 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:44:38 compute-0 nova_compute[183278]: 2026-01-21 18:44:38.210 183284 DEBUG nova.compute.manager [req-57bd7559-938c-4f6d-aef5-75fee743c5fd req-522069a9-c0e6-45b0-9ca1-b83687a5ff5f 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] No waiting events found dispatching network-vif-unplugged-e14186c1-07e6-4057-97ee-4c005c167ad0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:44:38 compute-0 nova_compute[183278]: 2026-01-21 18:44:38.210 183284 DEBUG nova.compute.manager [req-57bd7559-938c-4f6d-aef5-75fee743c5fd req-522069a9-c0e6-45b0-9ca1-b83687a5ff5f 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Received event network-vif-unplugged-e14186c1-07e6-4057-97ee-4c005c167ad0 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 21 18:44:38 compute-0 nova_compute[183278]: 2026-01-21 18:44:38.210 183284 DEBUG nova.compute.manager [req-57bd7559-938c-4f6d-aef5-75fee743c5fd req-522069a9-c0e6-45b0-9ca1-b83687a5ff5f 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Received event network-vif-plugged-e14186c1-07e6-4057-97ee-4c005c167ad0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:44:38 compute-0 nova_compute[183278]: 2026-01-21 18:44:38.211 183284 DEBUG oslo_concurrency.lockutils [req-57bd7559-938c-4f6d-aef5-75fee743c5fd req-522069a9-c0e6-45b0-9ca1-b83687a5ff5f 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:44:38 compute-0 nova_compute[183278]: 2026-01-21 18:44:38.211 183284 DEBUG oslo_concurrency.lockutils [req-57bd7559-938c-4f6d-aef5-75fee743c5fd req-522069a9-c0e6-45b0-9ca1-b83687a5ff5f 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:44:38 compute-0 nova_compute[183278]: 2026-01-21 18:44:38.211 183284 DEBUG oslo_concurrency.lockutils [req-57bd7559-938c-4f6d-aef5-75fee743c5fd req-522069a9-c0e6-45b0-9ca1-b83687a5ff5f 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:44:38 compute-0 nova_compute[183278]: 2026-01-21 18:44:38.211 183284 DEBUG nova.compute.manager [req-57bd7559-938c-4f6d-aef5-75fee743c5fd req-522069a9-c0e6-45b0-9ca1-b83687a5ff5f 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] No waiting events found dispatching network-vif-plugged-e14186c1-07e6-4057-97ee-4c005c167ad0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:44:38 compute-0 nova_compute[183278]: 2026-01-21 18:44:38.212 183284 WARNING nova.compute.manager [req-57bd7559-938c-4f6d-aef5-75fee743c5fd req-522069a9-c0e6-45b0-9ca1-b83687a5ff5f 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Received unexpected event network-vif-plugged-e14186c1-07e6-4057-97ee-4c005c167ad0 for instance with vm_state active and task_state migrating.
Jan 21 18:44:39 compute-0 nova_compute[183278]: 2026-01-21 18:44:39.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:44:40 compute-0 nova_compute[183278]: 2026-01-21 18:44:40.056 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:44:40 compute-0 nova_compute[183278]: 2026-01-21 18:44:40.057 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:44:40 compute-0 nova_compute[183278]: 2026-01-21 18:44:40.057 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:44:40 compute-0 nova_compute[183278]: 2026-01-21 18:44:40.057 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 18:44:40 compute-0 nova_compute[183278]: 2026-01-21 18:44:40.230 183284 WARNING nova.virt.libvirt.driver [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 18:44:40 compute-0 nova_compute[183278]: 2026-01-21 18:44:40.231 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5753MB free_disk=73.37879180908203GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 18:44:40 compute-0 nova_compute[183278]: 2026-01-21 18:44:40.231 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:44:40 compute-0 nova_compute[183278]: 2026-01-21 18:44:40.232 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:44:40 compute-0 nova_compute[183278]: 2026-01-21 18:44:40.596 183284 DEBUG nova.compute.manager [req-104e93ef-3873-439f-a607-bfc5fdce136f req-5e495f11-a118-441a-81a6-79b0e81fc80e 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Received event network-vif-plugged-e14186c1-07e6-4057-97ee-4c005c167ad0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:44:40 compute-0 nova_compute[183278]: 2026-01-21 18:44:40.596 183284 DEBUG oslo_concurrency.lockutils [req-104e93ef-3873-439f-a607-bfc5fdce136f req-5e495f11-a118-441a-81a6-79b0e81fc80e 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:44:40 compute-0 nova_compute[183278]: 2026-01-21 18:44:40.597 183284 DEBUG oslo_concurrency.lockutils [req-104e93ef-3873-439f-a607-bfc5fdce136f req-5e495f11-a118-441a-81a6-79b0e81fc80e 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:44:40 compute-0 nova_compute[183278]: 2026-01-21 18:44:40.597 183284 DEBUG oslo_concurrency.lockutils [req-104e93ef-3873-439f-a607-bfc5fdce136f req-5e495f11-a118-441a-81a6-79b0e81fc80e 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] Lock "c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:44:40 compute-0 nova_compute[183278]: 2026-01-21 18:44:40.597 183284 DEBUG nova.compute.manager [req-104e93ef-3873-439f-a607-bfc5fdce136f req-5e495f11-a118-441a-81a6-79b0e81fc80e 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] No waiting events found dispatching network-vif-plugged-e14186c1-07e6-4057-97ee-4c005c167ad0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:44:40 compute-0 nova_compute[183278]: 2026-01-21 18:44:40.598 183284 WARNING nova.compute.manager [req-104e93ef-3873-439f-a607-bfc5fdce136f req-5e495f11-a118-441a-81a6-79b0e81fc80e 8a1020e43db744408858e4e0e959780b c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Received unexpected event network-vif-plugged-e14186c1-07e6-4057-97ee-4c005c167ad0 for instance with vm_state active and task_state migrating.
Jan 21 18:44:40 compute-0 nova_compute[183278]: 2026-01-21 18:44:40.789 183284 INFO nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Updating resource usage from migration a52012a1-9293-4578-a442-fcb6740ebc5a
Jan 21 18:44:40 compute-0 nova_compute[183278]: 2026-01-21 18:44:40.824 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Migration a52012a1-9293-4578-a442-fcb6740ebc5a is active on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Jan 21 18:44:40 compute-0 nova_compute[183278]: 2026-01-21 18:44:40.825 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 18:44:40 compute-0 nova_compute[183278]: 2026-01-21 18:44:40.825 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 18:44:40 compute-0 nova_compute[183278]: 2026-01-21 18:44:40.875 183284 DEBUG nova.compute.provider_tree [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Inventory has not changed in ProviderTree for provider: 502e4243-611b-433d-a766-9b485d51652d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 18:44:40 compute-0 nova_compute[183278]: 2026-01-21 18:44:40.897 183284 DEBUG nova.scheduler.client.report [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Inventory has not changed for provider 502e4243-611b-433d-a766-9b485d51652d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 18:44:40 compute-0 nova_compute[183278]: 2026-01-21 18:44:40.921 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 18:44:40 compute-0 nova_compute[183278]: 2026-01-21 18:44:40.921 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.690s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:44:41 compute-0 nova_compute[183278]: 2026-01-21 18:44:41.837 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:44:41 compute-0 nova_compute[183278]: 2026-01-21 18:44:41.921 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:44:41 compute-0 podman[214151]: 2026-01-21 18:44:41.994346367 +0000 UTC m=+0.048736189 container health_status 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 21 18:44:42 compute-0 nova_compute[183278]: 2026-01-21 18:44:42.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:44:43 compute-0 nova_compute[183278]: 2026-01-21 18:44:43.058 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:44:43 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Jan 21 18:44:43 compute-0 systemd[213994]: Activating special unit Exit the Session...
Jan 21 18:44:43 compute-0 systemd[213994]: Stopped target Main User Target.
Jan 21 18:44:43 compute-0 systemd[213994]: Stopped target Basic System.
Jan 21 18:44:43 compute-0 systemd[213994]: Stopped target Paths.
Jan 21 18:44:43 compute-0 systemd[213994]: Stopped target Sockets.
Jan 21 18:44:43 compute-0 systemd[213994]: Stopped target Timers.
Jan 21 18:44:43 compute-0 systemd[213994]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 21 18:44:43 compute-0 systemd[213994]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 21 18:44:43 compute-0 systemd[213994]: Closed D-Bus User Message Bus Socket.
Jan 21 18:44:43 compute-0 systemd[213994]: Stopped Create User's Volatile Files and Directories.
Jan 21 18:44:43 compute-0 systemd[213994]: Removed slice User Application Slice.
Jan 21 18:44:43 compute-0 systemd[213994]: Reached target Shutdown.
Jan 21 18:44:43 compute-0 systemd[213994]: Finished Exit the Session.
Jan 21 18:44:43 compute-0 systemd[213994]: Reached target Exit the Session.
Jan 21 18:44:43 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Jan 21 18:44:43 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Jan 21 18:44:43 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 21 18:44:43 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 21 18:44:43 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 21 18:44:43 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 21 18:44:43 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Jan 21 18:44:43 compute-0 nova_compute[183278]: 2026-01-21 18:44:43.570 183284 DEBUG oslo_concurrency.lockutils [None req-0cbd730a-c6a4-4ffa-bc3b-ab4948b4c1be 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:44:43 compute-0 nova_compute[183278]: 2026-01-21 18:44:43.571 183284 DEBUG oslo_concurrency.lockutils [None req-0cbd730a-c6a4-4ffa-bc3b-ab4948b4c1be 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lock "c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:44:43 compute-0 nova_compute[183278]: 2026-01-21 18:44:43.571 183284 DEBUG oslo_concurrency.lockutils [None req-0cbd730a-c6a4-4ffa-bc3b-ab4948b4c1be 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lock "c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:44:43 compute-0 nova_compute[183278]: 2026-01-21 18:44:43.590 183284 DEBUG oslo_concurrency.lockutils [None req-0cbd730a-c6a4-4ffa-bc3b-ab4948b4c1be 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:44:43 compute-0 nova_compute[183278]: 2026-01-21 18:44:43.590 183284 DEBUG oslo_concurrency.lockutils [None req-0cbd730a-c6a4-4ffa-bc3b-ab4948b4c1be 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:44:43 compute-0 nova_compute[183278]: 2026-01-21 18:44:43.590 183284 DEBUG oslo_concurrency.lockutils [None req-0cbd730a-c6a4-4ffa-bc3b-ab4948b4c1be 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:44:43 compute-0 nova_compute[183278]: 2026-01-21 18:44:43.591 183284 DEBUG nova.compute.resource_tracker [None req-0cbd730a-c6a4-4ffa-bc3b-ab4948b4c1be 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 18:44:43 compute-0 nova_compute[183278]: 2026-01-21 18:44:43.716 183284 WARNING nova.virt.libvirt.driver [None req-0cbd730a-c6a4-4ffa-bc3b-ab4948b4c1be 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 18:44:43 compute-0 nova_compute[183278]: 2026-01-21 18:44:43.717 183284 DEBUG nova.compute.resource_tracker [None req-0cbd730a-c6a4-4ffa-bc3b-ab4948b4c1be 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5759MB free_disk=73.37879180908203GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": 
"0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 18:44:43 compute-0 nova_compute[183278]: 2026-01-21 18:44:43.717 183284 DEBUG oslo_concurrency.lockutils [None req-0cbd730a-c6a4-4ffa-bc3b-ab4948b4c1be 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:44:43 compute-0 nova_compute[183278]: 2026-01-21 18:44:43.718 183284 DEBUG oslo_concurrency.lockutils [None req-0cbd730a-c6a4-4ffa-bc3b-ab4948b4c1be 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:44:43 compute-0 nova_compute[183278]: 2026-01-21 18:44:43.759 183284 DEBUG nova.compute.resource_tracker [None req-0cbd730a-c6a4-4ffa-bc3b-ab4948b4c1be 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Migration for instance c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Jan 21 18:44:43 compute-0 nova_compute[183278]: 2026-01-21 18:44:43.781 183284 DEBUG nova.compute.resource_tracker [None req-0cbd730a-c6a4-4ffa-bc3b-ab4948b4c1be 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Jan 21 18:44:43 compute-0 nova_compute[183278]: 2026-01-21 18:44:43.799 183284 DEBUG nova.compute.resource_tracker [None req-0cbd730a-c6a4-4ffa-bc3b-ab4948b4c1be 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Migration a52012a1-9293-4578-a442-fcb6740ebc5a is active on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Jan 21 18:44:43 compute-0 nova_compute[183278]: 2026-01-21 18:44:43.799 183284 DEBUG nova.compute.resource_tracker [None req-0cbd730a-c6a4-4ffa-bc3b-ab4948b4c1be 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 18:44:43 compute-0 nova_compute[183278]: 2026-01-21 18:44:43.799 183284 DEBUG nova.compute.resource_tracker [None req-0cbd730a-c6a4-4ffa-bc3b-ab4948b4c1be 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 18:44:43 compute-0 nova_compute[183278]: 2026-01-21 18:44:43.834 183284 DEBUG nova.compute.provider_tree [None req-0cbd730a-c6a4-4ffa-bc3b-ab4948b4c1be 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Inventory has not changed in ProviderTree for provider: 502e4243-611b-433d-a766-9b485d51652d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 18:44:43 compute-0 nova_compute[183278]: 2026-01-21 18:44:43.856 183284 DEBUG nova.scheduler.client.report [None req-0cbd730a-c6a4-4ffa-bc3b-ab4948b4c1be 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Inventory has not changed for provider 502e4243-611b-433d-a766-9b485d51652d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 18:44:43 compute-0 nova_compute[183278]: 2026-01-21 18:44:43.879 183284 DEBUG nova.compute.resource_tracker [None req-0cbd730a-c6a4-4ffa-bc3b-ab4948b4c1be 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 18:44:43 compute-0 nova_compute[183278]: 2026-01-21 18:44:43.879 183284 DEBUG oslo_concurrency.lockutils [None req-0cbd730a-c6a4-4ffa-bc3b-ab4948b4c1be 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.161s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:44:43 compute-0 nova_compute[183278]: 2026-01-21 18:44:43.883 183284 INFO nova.compute.manager [None req-0cbd730a-c6a4-4ffa-bc3b-ab4948b4c1be 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Jan 21 18:44:43 compute-0 nova_compute[183278]: 2026-01-21 18:44:43.969 183284 INFO nova.scheduler.client.report [None req-0cbd730a-c6a4-4ffa-bc3b-ab4948b4c1be 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] Deleted allocation for migration a52012a1-9293-4578-a442-fcb6740ebc5a
Jan 21 18:44:43 compute-0 nova_compute[183278]: 2026-01-21 18:44:43.970 183284 DEBUG nova.virt.libvirt.driver [None req-0cbd730a-c6a4-4ffa-bc3b-ab4948b4c1be 02fd3489ce5048c79016ced1d1f128bf c659cd11960649e6bd82a4a523928a72 - - default default] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Jan 21 18:44:45 compute-0 nova_compute[183278]: 2026-01-21 18:44:45.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:44:45 compute-0 nova_compute[183278]: 2026-01-21 18:44:45.817 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 18:44:46 compute-0 nova_compute[183278]: 2026-01-21 18:44:46.838 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:44:48 compute-0 nova_compute[183278]: 2026-01-21 18:44:48.060 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:44:51 compute-0 nova_compute[183278]: 2026-01-21 18:44:51.088 183284 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769021076.0861273, c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:44:51 compute-0 nova_compute[183278]: 2026-01-21 18:44:51.089 183284 INFO nova.compute.manager [-] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] VM Stopped (Lifecycle Event)
Jan 21 18:44:51 compute-0 nova_compute[183278]: 2026-01-21 18:44:51.176 183284 DEBUG nova.compute.manager [None req-64a5d76a-8397-430c-900d-5a530d7e4915 - - - - - -] [instance: c59e91fb-6db5-4de3-ab9b-69b2ce8b60d2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:44:51 compute-0 nova_compute[183278]: 2026-01-21 18:44:51.839 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:44:52 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:44:52.348 104698 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=29, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'e2:aa:66', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f6:39:87:73:c8'}, ipsec=False) old=SB_Global(nb_cfg=28) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 18:44:52 compute-0 nova_compute[183278]: 2026-01-21 18:44:52.349 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:44:52 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:44:52.349 104698 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 21 18:44:53 compute-0 nova_compute[183278]: 2026-01-21 18:44:53.102 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:44:54 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:44:54.352 104698 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a4db021d-a451-4e5f-8011-49af760bda68, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '29'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:44:56 compute-0 nova_compute[183278]: 2026-01-21 18:44:56.842 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:44:58 compute-0 nova_compute[183278]: 2026-01-21 18:44:58.104 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:44:59 compute-0 podman[192560]: time="2026-01-21T18:44:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 18:44:59 compute-0 podman[192560]: @ - - [21/Jan/2026:18:44:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15354 "" "Go-http-client/1.1"
Jan 21 18:44:59 compute-0 podman[192560]: @ - - [21/Jan/2026:18:44:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2178 "" "Go-http-client/1.1"
Jan 21 18:44:59 compute-0 nova_compute[183278]: 2026-01-21 18:44:59.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._run_image_cache_manager_pass run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:44:59 compute-0 nova_compute[183278]: 2026-01-21 18:44:59.817 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:44:59 compute-0 nova_compute[183278]: 2026-01-21 18:44:59.818 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:44:59 compute-0 nova_compute[183278]: 2026-01-21 18:44:59.819 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:44:59 compute-0 nova_compute[183278]: 2026-01-21 18:44:59.819 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:44:59 compute-0 nova_compute[183278]: 2026-01-21 18:44:59.819 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:44:59 compute-0 nova_compute[183278]: 2026-01-21 18:44:59.819 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:44:59 compute-0 nova_compute[183278]: 2026-01-21 18:44:59.847 183284 DEBUG nova.virt.libvirt.imagecache [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Verify base images _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:314
Jan 21 18:44:59 compute-0 nova_compute[183278]: 2026-01-21 18:44:59.848 183284 WARNING nova.virt.libvirt.imagecache [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Unknown base file: /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685
Jan 21 18:44:59 compute-0 nova_compute[183278]: 2026-01-21 18:44:59.848 183284 INFO nova.virt.libvirt.imagecache [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Removable base files: /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685
Jan 21 18:44:59 compute-0 nova_compute[183278]: 2026-01-21 18:44:59.849 183284 INFO nova.virt.libvirt.imagecache [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/8bf45782a806e8f2f684ae874be9ab99d891a685
Jan 21 18:44:59 compute-0 nova_compute[183278]: 2026-01-21 18:44:59.849 183284 DEBUG nova.virt.libvirt.imagecache [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Verification complete _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:350
Jan 21 18:44:59 compute-0 nova_compute[183278]: 2026-01-21 18:44:59.849 183284 DEBUG nova.virt.libvirt.imagecache [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Verify swap images _age_and_verify_swap_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:299
Jan 21 18:44:59 compute-0 nova_compute[183278]: 2026-01-21 18:44:59.849 183284 DEBUG nova.virt.libvirt.imagecache [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Verify ephemeral images _age_and_verify_ephemeral_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:284
Jan 21 18:45:00 compute-0 podman[214179]: 2026-01-21 18:45:00.013926244 +0000 UTC m=+0.055662357 container health_status 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., distribution-scope=public, architecture=x86_64, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, release=1755695350, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, name=ubi9-minimal, vcs-type=git)
Jan 21 18:45:01 compute-0 openstack_network_exporter[195402]: ERROR   18:45:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 21 18:45:01 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:45:01 compute-0 openstack_network_exporter[195402]: ERROR   18:45:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 21 18:45:01 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:45:01 compute-0 nova_compute[183278]: 2026-01-21 18:45:01.844 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:45:03 compute-0 nova_compute[183278]: 2026-01-21 18:45:03.106 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:45:06 compute-0 nova_compute[183278]: 2026-01-21 18:45:06.846 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:45:06 compute-0 podman[214201]: 2026-01-21 18:45:06.99641934 +0000 UTC m=+0.048542475 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent)
Jan 21 18:45:07 compute-0 podman[214200]: 2026-01-21 18:45:07.016576517 +0000 UTC m=+0.072885573 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 21 18:45:08 compute-0 nova_compute[183278]: 2026-01-21 18:45:08.108 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:45:11 compute-0 nova_compute[183278]: 2026-01-21 18:45:11.848 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:45:12 compute-0 podman[214244]: 2026-01-21 18:45:12.986417073 +0000 UTC m=+0.046718590 container health_status 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 18:45:13 compute-0 nova_compute[183278]: 2026-01-21 18:45:13.226 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:45:16 compute-0 nova_compute[183278]: 2026-01-21 18:45:16.848 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:45:18 compute-0 nova_compute[183278]: 2026-01-21 18:45:18.228 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:45:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:45:20.118 104698 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:45:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:45:20.119 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:45:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:45:20.119 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:45:21 compute-0 nova_compute[183278]: 2026-01-21 18:45:21.850 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:45:22 compute-0 ovn_controller[95419]: 2026-01-21T18:45:22Z|00215|memory_trim|INFO|Detected inactivity (last active 30006 ms ago): trimming memory
Jan 21 18:45:23 compute-0 nova_compute[183278]: 2026-01-21 18:45:23.291 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:45:26 compute-0 nova_compute[183278]: 2026-01-21 18:45:26.851 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:45:28 compute-0 nova_compute[183278]: 2026-01-21 18:45:28.050 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:45:28 compute-0 nova_compute[183278]: 2026-01-21 18:45:28.292 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:45:29 compute-0 podman[192560]: time="2026-01-21T18:45:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 18:45:29 compute-0 podman[192560]: @ - - [21/Jan/2026:18:45:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15354 "" "Go-http-client/1.1"
Jan 21 18:45:29 compute-0 podman[192560]: @ - - [21/Jan/2026:18:45:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2181 "" "Go-http-client/1.1"
Jan 21 18:45:30 compute-0 nova_compute[183278]: 2026-01-21 18:45:30.849 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:45:30 compute-0 nova_compute[183278]: 2026-01-21 18:45:30.850 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 18:45:30 compute-0 nova_compute[183278]: 2026-01-21 18:45:30.850 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 18:45:30 compute-0 nova_compute[183278]: 2026-01-21 18:45:30.871 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 21 18:45:30 compute-0 podman[214269]: 2026-01-21 18:45:30.995284313 +0000 UTC m=+0.051496147 container health_status 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, vcs-type=git, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, architecture=x86_64, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Jan 21 18:45:31 compute-0 openstack_network_exporter[195402]: ERROR   18:45:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 21 18:45:31 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:45:31 compute-0 openstack_network_exporter[195402]: ERROR   18:45:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 21 18:45:31 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:45:31 compute-0 nova_compute[183278]: 2026-01-21 18:45:31.853 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:45:33 compute-0 nova_compute[183278]: 2026-01-21 18:45:33.296 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:45:35 compute-0 nova_compute[183278]: 2026-01-21 18:45:35.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:45:36 compute-0 nova_compute[183278]: 2026-01-21 18:45:36.815 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:45:36 compute-0 nova_compute[183278]: 2026-01-21 18:45:36.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:45:36 compute-0 nova_compute[183278]: 2026-01-21 18:45:36.854 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:45:37 compute-0 nova_compute[183278]: 2026-01-21 18:45:37.811 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:45:37 compute-0 podman[214292]: 2026-01-21 18:45:37.995400103 +0000 UTC m=+0.046171867 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 21 18:45:38 compute-0 podman[214291]: 2026-01-21 18:45:38.021157655 +0000 UTC m=+0.072283137 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 21 18:45:38 compute-0 nova_compute[183278]: 2026-01-21 18:45:38.297 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:45:39 compute-0 nova_compute[183278]: 2026-01-21 18:45:39.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:45:39 compute-0 nova_compute[183278]: 2026-01-21 18:45:39.947 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:45:39 compute-0 nova_compute[183278]: 2026-01-21 18:45:39.947 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:45:39 compute-0 nova_compute[183278]: 2026-01-21 18:45:39.947 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:45:39 compute-0 nova_compute[183278]: 2026-01-21 18:45:39.947 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 18:45:40 compute-0 nova_compute[183278]: 2026-01-21 18:45:40.076 183284 WARNING nova.virt.libvirt.driver [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 18:45:40 compute-0 nova_compute[183278]: 2026-01-21 18:45:40.077 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5837MB free_disk=73.37878799438477GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 18:45:40 compute-0 nova_compute[183278]: 2026-01-21 18:45:40.077 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:45:40 compute-0 nova_compute[183278]: 2026-01-21 18:45:40.077 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:45:40 compute-0 nova_compute[183278]: 2026-01-21 18:45:40.495 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 18:45:40 compute-0 nova_compute[183278]: 2026-01-21 18:45:40.495 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 18:45:40 compute-0 nova_compute[183278]: 2026-01-21 18:45:40.514 183284 DEBUG nova.compute.provider_tree [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Inventory has not changed in ProviderTree for provider: 502e4243-611b-433d-a766-9b485d51652d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 18:45:40 compute-0 nova_compute[183278]: 2026-01-21 18:45:40.532 183284 DEBUG nova.scheduler.client.report [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Inventory has not changed for provider 502e4243-611b-433d-a766-9b485d51652d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 18:45:40 compute-0 nova_compute[183278]: 2026-01-21 18:45:40.534 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 18:45:40 compute-0 nova_compute[183278]: 2026-01-21 18:45:40.534 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.457s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:45:40 compute-0 nova_compute[183278]: 2026-01-21 18:45:40.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:45:40 compute-0 nova_compute[183278]: 2026-01-21 18:45:40.817 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 21 18:45:40 compute-0 nova_compute[183278]: 2026-01-21 18:45:40.840 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 21 18:45:41 compute-0 nova_compute[183278]: 2026-01-21 18:45:41.839 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:45:41 compute-0 nova_compute[183278]: 2026-01-21 18:45:41.855 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:45:42 compute-0 nova_compute[183278]: 2026-01-21 18:45:42.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:45:42 compute-0 nova_compute[183278]: 2026-01-21 18:45:42.817 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 21 18:45:43 compute-0 nova_compute[183278]: 2026-01-21 18:45:43.301 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:45:43 compute-0 nova_compute[183278]: 2026-01-21 18:45:43.838 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:45:44 compute-0 podman[214334]: 2026-01-21 18:45:44.017107014 +0000 UTC m=+0.073214351 container health_status 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 21 18:45:45 compute-0 nova_compute[183278]: 2026-01-21 18:45:45.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:45:45 compute-0 nova_compute[183278]: 2026-01-21 18:45:45.816 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 18:45:46 compute-0 nova_compute[183278]: 2026-01-21 18:45:46.857 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:45:48 compute-0 nova_compute[183278]: 2026-01-21 18:45:48.305 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:45:48 compute-0 nova_compute[183278]: 2026-01-21 18:45:48.811 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:45:51 compute-0 nova_compute[183278]: 2026-01-21 18:45:51.858 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:45:53 compute-0 nova_compute[183278]: 2026-01-21 18:45:53.307 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:45:56 compute-0 nova_compute[183278]: 2026-01-21 18:45:56.860 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:45:58 compute-0 nova_compute[183278]: 2026-01-21 18:45:58.310 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:45:59 compute-0 podman[192560]: time="2026-01-21T18:45:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 18:45:59 compute-0 podman[192560]: @ - - [21/Jan/2026:18:45:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15354 "" "Go-http-client/1.1"
Jan 21 18:45:59 compute-0 podman[192560]: @ - - [21/Jan/2026:18:45:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2182 "" "Go-http-client/1.1"
Jan 21 18:46:01 compute-0 openstack_network_exporter[195402]: ERROR   18:46:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 21 18:46:01 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:46:01 compute-0 openstack_network_exporter[195402]: ERROR   18:46:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 21 18:46:01 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:46:01 compute-0 anacron[143712]: Job `cron.monthly' started
Jan 21 18:46:01 compute-0 anacron[143712]: Job `cron.monthly' terminated
Jan 21 18:46:01 compute-0 anacron[143712]: Normal exit (3 jobs run)
Jan 21 18:46:01 compute-0 podman[214361]: 2026-01-21 18:46:01.635069633 +0000 UTC m=+0.056431235 container health_status 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, version=9.6, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.buildah.version=1.33.7, vendor=Red Hat, Inc., io.openshift.expose-services=)
Jan 21 18:46:01 compute-0 nova_compute[183278]: 2026-01-21 18:46:01.817 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:46:01 compute-0 nova_compute[183278]: 2026-01-21 18:46:01.862 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:46:03 compute-0 nova_compute[183278]: 2026-01-21 18:46:03.312 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:46:05 compute-0 ovn_controller[95419]: 2026-01-21T18:46:05Z|00216|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Jan 21 18:46:06 compute-0 nova_compute[183278]: 2026-01-21 18:46:06.864 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:46:08 compute-0 nova_compute[183278]: 2026-01-21 18:46:08.315 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:46:08 compute-0 podman[214384]: 2026-01-21 18:46:08.998288043 +0000 UTC m=+0.047876009 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 21 18:46:09 compute-0 podman[214383]: 2026-01-21 18:46:09.019309671 +0000 UTC m=+0.074927933 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 21 18:46:11 compute-0 nova_compute[183278]: 2026-01-21 18:46:11.865 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:46:13 compute-0 nova_compute[183278]: 2026-01-21 18:46:13.317 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:46:15 compute-0 podman[214428]: 2026-01-21 18:46:15.019654924 +0000 UTC m=+0.074689166 container health_status 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 21 18:46:16 compute-0 nova_compute[183278]: 2026-01-21 18:46:16.907 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:46:18 compute-0 nova_compute[183278]: 2026-01-21 18:46:18.321 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:46:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:46:20.119 104698 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:46:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:46:20.120 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:46:20 compute-0 ovn_metadata_agent[104693]: 2026-01-21 18:46:20.120 104698 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:46:21 compute-0 nova_compute[183278]: 2026-01-21 18:46:21.909 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:46:23 compute-0 nova_compute[183278]: 2026-01-21 18:46:23.323 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:46:26 compute-0 nova_compute[183278]: 2026-01-21 18:46:26.909 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:46:28 compute-0 nova_compute[183278]: 2026-01-21 18:46:28.326 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:46:29 compute-0 podman[192560]: time="2026-01-21T18:46:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 18:46:29 compute-0 podman[192560]: @ - - [21/Jan/2026:18:46:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15354 "" "Go-http-client/1.1"
Jan 21 18:46:29 compute-0 podman[192560]: @ - - [21/Jan/2026:18:46:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2182 "" "Go-http-client/1.1"
Jan 21 18:46:30 compute-0 nova_compute[183278]: 2026-01-21 18:46:30.845 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:46:30 compute-0 nova_compute[183278]: 2026-01-21 18:46:30.845 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 18:46:30 compute-0 nova_compute[183278]: 2026-01-21 18:46:30.846 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 18:46:30 compute-0 nova_compute[183278]: 2026-01-21 18:46:30.865 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 21 18:46:31 compute-0 openstack_network_exporter[195402]: ERROR   18:46:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 21 18:46:31 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:46:31 compute-0 openstack_network_exporter[195402]: ERROR   18:46:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 21 18:46:31 compute-0 openstack_network_exporter[195402]: 
Jan 21 18:46:31 compute-0 nova_compute[183278]: 2026-01-21 18:46:31.910 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:46:31 compute-0 podman[214452]: 2026-01-21 18:46:31.987282482 +0000 UTC m=+0.049204756 container health_status 9341e102103e4f08e892499432850c0fb857a4c2afb178f82d393fd76e3f3c4b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, container_name=openstack_network_exporter, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, config_id=openstack_network_exporter, release=1755695350, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, managed_by=edpm_ansible)
Jan 21 18:46:33 compute-0 nova_compute[183278]: 2026-01-21 18:46:33.330 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:46:33 compute-0 nova_compute[183278]: 2026-01-21 18:46:33.707 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:46:36 compute-0 nova_compute[183278]: 2026-01-21 18:46:36.956 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:46:37 compute-0 nova_compute[183278]: 2026-01-21 18:46:37.845 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:46:37 compute-0 nova_compute[183278]: 2026-01-21 18:46:37.845 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:46:38 compute-0 sshd-session[214474]: Accepted publickey for zuul from 192.168.122.10 port 45492 ssh2: ECDSA SHA256:cKen23vhgWniT1Xii+Y+iYDmAKCwydOnjbSSWnKmVRE
Jan 21 18:46:38 compute-0 systemd-logind[782]: New session 46 of user zuul.
Jan 21 18:46:38 compute-0 systemd[1]: Started Session 46 of User zuul.
Jan 21 18:46:38 compute-0 sshd-session[214474]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 21 18:46:38 compute-0 sudo[214478]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt'
Jan 21 18:46:38 compute-0 sudo[214478]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 18:46:38 compute-0 nova_compute[183278]: 2026-01-21 18:46:38.332 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:46:38 compute-0 nova_compute[183278]: 2026-01-21 18:46:38.812 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:46:38 compute-0 nova_compute[183278]: 2026-01-21 18:46:38.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:46:39 compute-0 podman[214513]: 2026-01-21 18:46:39.255390542 +0000 UTC m=+0.056006482 container health_status db4be7d3093273cd96db9494c4f2b0960d918b418e0c6d6b0c80258d2ac1b456 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 21 18:46:39 compute-0 podman[214512]: 2026-01-21 18:46:39.279283647 +0000 UTC m=+0.078716968 container health_status 16da501c1175ec0513d60666f7acf1ace63ffa0f098637e75a47920f1bce507a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7926ac609a5cd8e3f5cd322ca64f951c35dc79b9937dd264190afe22a8d871a9-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 21 18:46:41 compute-0 nova_compute[183278]: 2026-01-21 18:46:41.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:46:41 compute-0 nova_compute[183278]: 2026-01-21 18:46:41.847 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:46:41 compute-0 nova_compute[183278]: 2026-01-21 18:46:41.848 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:46:41 compute-0 nova_compute[183278]: 2026-01-21 18:46:41.848 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:46:41 compute-0 nova_compute[183278]: 2026-01-21 18:46:41.848 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 18:46:41 compute-0 nova_compute[183278]: 2026-01-21 18:46:41.958 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:46:41 compute-0 nova_compute[183278]: 2026-01-21 18:46:41.992 183284 WARNING nova.virt.libvirt.driver [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 18:46:41 compute-0 nova_compute[183278]: 2026-01-21 18:46:41.993 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5760MB free_disk=73.37870407104492GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 18:46:41 compute-0 nova_compute[183278]: 2026-01-21 18:46:41.993 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:46:41 compute-0 nova_compute[183278]: 2026-01-21 18:46:41.993 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:46:42 compute-0 nova_compute[183278]: 2026-01-21 18:46:42.162 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 18:46:42 compute-0 nova_compute[183278]: 2026-01-21 18:46:42.162 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 18:46:42 compute-0 nova_compute[183278]: 2026-01-21 18:46:42.191 183284 DEBUG nova.compute.provider_tree [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Inventory has not changed in ProviderTree for provider: 502e4243-611b-433d-a766-9b485d51652d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 18:46:42 compute-0 nova_compute[183278]: 2026-01-21 18:46:42.206 183284 DEBUG nova.scheduler.client.report [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Inventory has not changed for provider 502e4243-611b-433d-a766-9b485d51652d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 18:46:42 compute-0 nova_compute[183278]: 2026-01-21 18:46:42.208 183284 DEBUG nova.compute.resource_tracker [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 18:46:42 compute-0 nova_compute[183278]: 2026-01-21 18:46:42.208 183284 DEBUG oslo_concurrency.lockutils [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.215s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:46:42 compute-0 ovs-vsctl[214691]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Jan 21 18:46:43 compute-0 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 214502 (sos)
Jan 21 18:46:43 compute-0 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Jan 21 18:46:43 compute-0 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Jan 21 18:46:43 compute-0 nova_compute[183278]: 2026-01-21 18:46:43.335 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:46:43 compute-0 virtqemud[182681]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Jan 21 18:46:43 compute-0 virtqemud[182681]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Jan 21 18:46:43 compute-0 virtqemud[182681]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Jan 21 18:46:44 compute-0 nova_compute[183278]: 2026-01-21 18:46:44.208 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:46:44 compute-0 crontab[215099]: (root) LIST (root)
Jan 21 18:46:44 compute-0 nova_compute[183278]: 2026-01-21 18:46:44.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:46:46 compute-0 podman[215178]: 2026-01-21 18:46:46.016771845 +0000 UTC m=+0.063558092 container health_status 602390e7add4f0887a756c79efce74830bf48cef60be280b8f29ef412c643feb (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6c7afe8a678f9e5773827ab45a6fec4658b6c3deb994ca79a609988ee8d22dc5-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 21 18:46:46 compute-0 systemd[1]: Starting Hostname Service...
Jan 21 18:46:46 compute-0 systemd[1]: Started Hostname Service.
Jan 21 18:46:46 compute-0 nova_compute[183278]: 2026-01-21 18:46:46.816 183284 DEBUG oslo_service.periodic_task [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:46:46 compute-0 nova_compute[183278]: 2026-01-21 18:46:46.817 183284 DEBUG nova.compute.manager [None req-d3e5cd76-e39e-4fc1-99ce-2faa5fd6e7ab - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 18:46:46 compute-0 nova_compute[183278]: 2026-01-21 18:46:46.988 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:46:48 compute-0 nova_compute[183278]: 2026-01-21 18:46:48.339 183284 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
